Differential Entropy
Entropy of continuous random variables.
Differential Entropy Quiz
Test your grasp of continuous random variable entropy and its properties.
Q1. The differential entropy of a Gaussian random variable X with variance sigma^2 is h(X) = (1/2) log(2*pi*e*sigma^2). If the variance is doubled, by how much does the differential entropy change?
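The change can be checked numerically. A minimal sketch (the function name `gaussian_diff_entropy` is an illustrative choice, not from the source): doubling the variance adds (1/2) log(2) to h(X), i.e. 0.5 bits when the logarithm is base 2, regardless of the original variance.

```python
import math

def gaussian_diff_entropy(var, base=2):
    """Differential entropy h(X) = (1/2) log(2*pi*e*var) of a Gaussian,
    in units determined by the log base (base 2 gives bits)."""
    return 0.5 * math.log(2 * math.pi * math.e * var, base)

h_before = gaussian_diff_entropy(1.0)   # sigma^2 = 1
h_after = gaussian_diff_entropy(2.0)    # variance doubled
# The difference is (1/2) log2(2) = 0.5 bits, independent of sigma^2
print(round(h_after - h_before, 6))
```

Running the same comparison for any other starting variance gives the identical difference, since h depends on the variance only through log(sigma^2).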
Related Articles
Source Entropy: Entropy H = -sum P*log2(P), maximum entropy conditions.
Joint and Conditional Entropy: H(X,Y), H(Y|X), chain rule for entropy.
Information Theory Basics: Entropy, information content I = -log2(P), Shannon.
Differential PSK: DPSK modulation and demodulation, non-coherent advantage.
Mutual Information: I(X;Y) = H(X) - H(X|Y), channel relationship.