Joint and Conditional Entropy
Joint entropy H(X,Y), conditional entropy H(Y|X), and the chain rule for entropy.
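As a minimal sketch of these quantities, the snippet below computes H(X,Y), H(X), and H(Y|X) for a small hypothetical joint distribution (the distribution itself is an illustrative assumption, not from this page) and checks the chain rule H(X,Y) = H(X) + H(Y|X):

```python
import math

def entropy(probs):
    """Shannon entropy in bits; zero-probability terms contribute 0."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Hypothetical joint distribution p(x, y) over X in {0, 1}, Y in {0, 1}
joint = {(0, 0): 0.25, (0, 1): 0.25, (1, 0): 0.4, (1, 1): 0.1}

# Joint entropy H(X, Y)
h_xy = entropy(joint.values())

# Marginal p(x) and marginal entropy H(X)
px = {}
for (x, _), p in joint.items():
    px[x] = px.get(x, 0.0) + p
h_x = entropy(px.values())

# Conditional entropy H(Y|X) = sum over x of p(x) * H(Y | X = x)
h_y_given_x = 0.0
for x, p_x in px.items():
    cond = [p / p_x for (xx, _), p in joint.items() if xx == x]
    h_y_given_x += p_x * entropy(cond)

# Chain rule: H(X, Y) = H(X) + H(Y|X)
print(h_xy, h_x + h_y_given_x)
```

For this distribution both sides come out equal (about 1.861 bits), and the identity holds for any finite joint distribution.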
Joint and Conditional Entropy Quiz
Test your ability to apply joint entropy, conditional entropy, and the chain rule for entropy.
Related Articles
Differential Entropy
Entropy of continuous random variables.
5 min read
Information Theory Basics
Entropy, information content I = -log2(P), and Shannon's framework.
6 min read
BEC Channel
Binary Erasure Channel properties.
5 min read
BSC Channel
Binary Symmetric Channel, capacity calculation.
10 min read
Shannon Limit
Implications of the Shannon-Hartley theorem.
6 min read