Source Entropy
Entropy H = -Σ p_i log2(p_i); conditions for maximum and minimum entropy.
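The definition above can be written out in full, together with the standard maximum-entropy condition (entropy is largest when all M symbols are equally likely):

```latex
H(X) = -\sum_{i=1}^{M} p_i \log_2 p_i \quad \text{bits/symbol},
\qquad
H_{\max} = \log_2 M \ \text{ when } p_i = \tfrac{1}{M} \ \text{for all } i.
```

The minimum, H = 0, occurs when one symbol has probability 1 and all others have probability 0.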
Source Entropy Quiz
Test your ability to compute entropy and identify conditions for maximum and minimum entropy.
Question 1 of 3
Q1. A discrete source emits 4 symbols with probabilities {1/2, 1/4, 1/8, 1/8}. The entropy H of this source is:
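A quick way to check an answer to a question like Q1 is to compute the entropy directly from the definition. A minimal sketch (the function name `entropy` is ours, not from the source):

```python
from math import log2

def entropy(probs):
    """Shannon entropy in bits: H = -sum(p * log2(p)), skipping zero-probability terms."""
    return -sum(p * log2(p) for p in probs if p > 0)

# Distribution from Q1: {1/2, 1/4, 1/8, 1/8}
H = entropy([0.5, 0.25, 0.125, 0.125])
print(H)  # 1.75 bits/symbol
```

Term by term: (1/2)(1) + (1/4)(2) + (1/8)(3) + (1/8)(3) = 1.75 bits/symbol.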
Related Articles
- Differential Entropy: entropy of continuous random variables.
- Source Coding Theorem: Shannon's first theorem, entropy rate, and the limits of compression.
- Mutual Information: I(X;Y) = H(X) - H(X|Y), the relationship between channel input and output.
- BEC Channel: properties of the Binary Erasure Channel.
- BSC Channel: the Binary Symmetric Channel and its capacity calculation.