Reading-Notes

You are what you read, here are mine

Shannon defined entropy as the smallest possible average size of a lossless encoding of the messages sent from the source to the destination.

In general, to distinguish between N different values in bits, we need $\log_2 N$ bits, and we never need more than this.

If a message type occurs 1 out of N times (so its probability is $P = 1/N$), the formula above gives the minimum size required to encode it:

$$\log_2 N = -\log_2 \frac{1}{N} = -\log_2 P$$
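For example (a worked case, not from the original notes), with $N = 8$ equally likely messages, each message needs $\log_2 8 = 3$ bits, which matches $-\log_2 P = -\log_2 \frac{1}{8} = 3$.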

Weighting each message's size by its probability to get the average size, we obtain:

$$\text{Entropy} = -\sum_i P(i) \log_2 P(i)$$

If the entropy is high, the average encoding size is large, which means each message tends to carry more information. This is why high entropy is associated with disorder, uncertainty, surprise, unpredictability, and the amount of information.
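As a minimal sketch (the function name `entropy` and the example distributions below are illustrative, not from the notes), the formula can be computed directly from a probability distribution:

```python
import math

def entropy(probabilities):
    """Shannon entropy in bits: -sum(P(i) * log2 P(i)) over outcomes with P(i) > 0."""
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

# A fair coin: 2 equally likely messages -> 1 bit on average.
print(entropy([0.5, 0.5]))   # 1.0

# A biased coin is more predictable, so its entropy is lower.
print(entropy([0.9, 0.1]))   # ~0.469

# 8 equally likely messages -> log2(8) = 3 bits.
print(entropy([1/8] * 8))    # 3.0
```

The uniform distribution maximizes the entropy for a given number of messages, while a skewed distribution (more predictable) yields a smaller average encoding size.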
