Information Theory Basics
Covers entropy, the self-information of an outcome I(x) = -log2(P(x)), and Shannon's foundational results.
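To make the definitions concrete, here is a minimal Python sketch (not part of the original article); the function names self_information and entropy are illustrative, not from any library:

```python
import math

def self_information(p: float) -> float:
    """Self-information of an outcome with probability p, in bits: I = -log2(p)."""
    return -math.log2(p)

def entropy(probs: list[float]) -> float:
    """Shannon entropy of a distribution, in bits: H = -sum(p * log2(p))."""
    # The p > 0 guard follows the convention 0 * log2(0) = 0.
    return -sum(p * math.log2(p) for p in probs if p > 0)

print(self_information(0.5))    # 1.0 bit: a fair coin flip
print(entropy([0.5, 0.5]))      # 1.0 bit: entropy of a fair coin
print(self_information(1 / 32)) # 5.0 bits: rarer outcomes carry more surprise
```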
Information Theory Basics Quiz
Test your command of self-information, entropy definitions, and Shannon's foundational results.
Question 1 of 3
Q1. A source emits symbol x with probability P(x) = 1/32. What is the self-information of this symbol in bits?
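Worked answer, for reference: P(x) = 1/32 = 2^-5, so I(x) = -log2(1/32) = log2(32) = 5 bits.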
Related Articles
Mutual Information
I(X;Y) = H(X) - H(X|Y), the information shared between channel input and output.
Differential Entropy
Entropy of continuous random variables.
Shannon Limit
Implications of the Shannon-Hartley theorem.
BEC Channel
Binary Erasure Channel properties.
BSC Channel
The Binary Symmetric Channel and its capacity calculation.