Information Theory Basics

Entropy, self-information I = -log2(P), and Shannon's foundational results.

Mohith N
Updated: 19 March 2026
6 min read
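The two quantities named in the teaser can be sketched in a few lines of Python (a minimal illustration under standard definitions, not code from the article itself):

```python
import math

def self_information(p: float) -> float:
    """Self-information (surprisal) of an outcome with probability p, in bits."""
    return -math.log2(p)

def entropy(dist: list[float]) -> float:
    """Shannon entropy H = -sum(p * log2(p)) of a discrete distribution, in bits."""
    return sum(-p * math.log2(p) for p in dist if p > 0)

# A fair coin carries 1 bit of entropy; a fair 8-sided die carries 3 bits.
print(entropy([0.5, 0.5]))   # 1.0
print(entropy([1 / 8] * 8))  # 3.0
```

Note that entropy is the expected value of self-information over the source's distribution, which is why both reduce to the same -log2 term.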

Information Theory Basics Quiz

Test your command of self-information, entropy definitions, and Shannon's foundational results.

Question 1 of 3

Q1. A source emits symbol x with probability P(x) = 1/32. What is the self-information of this symbol in bits?
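Working through the arithmetic (this is my derivation, not an answer key from the quiz): since P(x) = 1/32 = 2^-5, the definition I = -log2(P) gives I(x) = -log2(2^-5) = 5 bits. A one-line check:

```python
import math

p = 1 / 32
info_bits = -math.log2(p)  # -log2(2**-5) = 5 bits
print(info_bits)  # 5.0
```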