Mutual Information
Mutual information I(X;Y) = H(X) - H(X|Y) measures how much observing the channel output Y reduces uncertainty about the input X. It is symmetric, I(X;Y) = H(Y) - H(Y|X), non-negative, and equals H(X) + H(Y) - H(X,Y); channel capacity is its maximum over input distributions.
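A minimal sketch of computing I(X;Y) for a discrete joint distribution. The joint pmf below is an assumed example: a binary symmetric channel with crossover probability 0.1 and a uniform input, not a value taken from the article.

```python
import numpy as np

# Hypothetical joint pmf p(x, y): BSC with crossover 0.1, uniform input.
p_xy = np.array([[0.45, 0.05],
                 [0.05, 0.45]])

def entropy(p):
    """Shannon entropy in bits, skipping zero-probability entries."""
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

p_x = p_xy.sum(axis=1)  # marginal of X
p_y = p_xy.sum(axis=0)  # marginal of Y

# I(X;Y) = H(X) + H(Y) - H(X,Y)
mi = entropy(p_x) + entropy(p_y) - entropy(p_xy.ravel())

# Cross-check against the definition I(X;Y) = H(X) - H(X|Y),
# using H(X|Y) = H(X,Y) - H(Y).
h_x_given_y = entropy(p_xy.ravel()) - entropy(p_y)
assert abs(mi - (entropy(p_x) - h_x_given_y)) < 1e-12
```

For this channel the result matches the familiar BSC formula 1 - H2(0.1) ≈ 0.531 bits per use.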
Related Articles
- Information Theory Basics: entropy and information content, I = -log2(P), Shannon's framework.
- BEC Channel: Binary Erasure Channel properties.
- BSC Channel: Binary Symmetric Channel and its capacity calculation.
- Differential Entropy: entropy of continuous random variables.
- Source Entropy: H = -sum P*log2(P) and maximum-entropy conditions.