Information Theory

Information Theory is a field that addresses two questions:

  1. Source Coding: How many bits do I need to losslessly represent an observation?

  2. Channel Coding: How reliably and quickly can I communicate a message over a noisy channel?

Quantifying Information

Definition 40

Definition 41

Definition 42
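
Assuming these definitions build up the usual basic quantities (the surprise of an outcome, the entropy of a discrete random variable, and the conditional entropy), their standard forms are

$$S(x) = \log \frac{1}{p(x)}, \qquad H(X) = \sum_x p(x) \log \frac{1}{p(x)}, \qquad H(X \mid Y) = \sum_{x, y} p(x, y) \log \frac{1}{p(x \mid y)}.$$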

Theorem 23 (Chain Rule of Entropy)
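
In its standard form, the chain rule says the joint entropy decomposes as

$$H(X, Y) = H(X) + H(Y \mid X),$$

and more generally $H(X_1, \ldots, X_n) = \sum_{i=1}^{n} H(X_i \mid X_1, \ldots, X_{i-1})$.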

In addition to knowing how much our surprise about one random variable changes when we observe another, we can quantify how much additional information one random variable gives us about another.

Definition 43
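
Assuming, as the surrounding discussion suggests, that this is the mutual information, its standard form is

$$I(X; Y) = H(X) - H(X \mid Y) = \sum_{x, y} p(x, y) \log \frac{p(x, y)}{p(x)\, p(y)},$$

the average reduction in our surprise about $X$ from observing $Y$.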

Source Coding

Theorem 24 (Asymptotic Equipartition Property)
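
In its standard form, the AEP says that if $X_1, X_2, \ldots$ are drawn i.i.d. from $p$, then

$$-\frac{1}{n} \log p(X_1, \ldots, X_n) \longrightarrow H(X) \quad \text{in probability},$$

so with high probability a long sequence has probability close to $2^{-nH(X)}$.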

Definition 44
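
Given the properties discussed next, this is presumably the typical set, which in standard notation is

$$A_\epsilon^{(n)} = \left\{ x^n : \left| -\frac{1}{n} \log p(x^n) - H(X) \right| \le \epsilon \right\}.$$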

Two important properties of the typical set are that its total probability exceeds $1 - \epsilon$ for sufficiently large $n$, and that it contains at most $2^{n(H(X) + \epsilon)}$ sequences.

This makes the average number of bits required to describe a message approximately $nH(X)$: with high probability the message is typical, so it suffices to index the at most $2^{n(H(X) + \epsilon)}$ typical sequences, which takes about $n(H(X) + \epsilon)$ bits.
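
As a numerical illustration of this concentration, here is a minimal sketch assuming an i.i.d. Bernoulli source with an illustrative parameter $p = 0.3$ (not a value from these notes):

```python
import numpy as np

# Sanity check of the AEP for an i.i.d. Bernoulli(0.3) source. As n grows,
# the per-symbol surprise -(1/n) log2 p(X^n) concentrates around H(X).

p = 0.3
H = -(p * np.log2(p) + (1 - p) * np.log2(1 - p))  # source entropy in bits

rng = np.random.default_rng(0)
for n in [10, 100, 1_000, 10_000]:
    x = rng.random(n) < p                   # one length-n sample sequence
    k = x.sum()                             # number of ones in the sequence
    log_p = k * np.log2(p) + (n - k) * np.log2(1 - p)   # log2 p(x^n)
    print(f"n={n:6d}   -(1/n) log2 p = {-log_p / n:.4f}   H(X) = {H:.4f}")
```

As $n$ grows, the printed per-symbol surprise approaches $H(X) \approx 0.881$ bits.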

This is the first half of a central result of source coding.

Theorem 25 (Source Coding Theorem)
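
In its standard form, the theorem says that a source of i.i.d. symbols with entropy $H(X)$ can be compressed at any rate above $H(X)$ bits per symbol with error probability vanishing as the block length grows, and cannot be compressed reliably at any rate below $H(X)$.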

Channel Coding

Definition 45

In words, the capacity describes the maximum mutual information between the channel input and output.
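
In symbols, for a discrete memoryless channel with input $X$ and output $Y$,

$$C = \max_{p(x)} I(X; Y).$$

For a binary symmetric channel that flips each input bit with probability $p$, this works out to $C = 1 - H_b(p)$, where $H_b$ is the binary entropy function. A minimal sketch of that computation (the function names are illustrative):

```python
import numpy as np

def binary_entropy(q: float) -> float:
    """H_b(q) in bits, with the convention H_b(0) = H_b(1) = 0."""
    if q in (0.0, 1.0):
        return 0.0
    return float(-q * np.log2(q) - (1 - q) * np.log2(1 - q))

def bsc_capacity(p: float) -> float:
    """Capacity of a binary symmetric channel: C = 1 - H_b(p)."""
    return 1.0 - binary_entropy(p)

for p in [0.0, 0.05, 0.11, 0.5]:
    print(f"p = {p:.2f}  ->  C = {bsc_capacity(p):.4f} bits per channel use")
```

Note the endpoints: at $p = 0.5$ the output is independent of the input and the capacity is zero.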

Definition 46
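
If, as in the usual development, this definition introduces the rate of a channel code, then a code with $M$ messages and block length $n$ has rate

$$R = \frac{\log M}{n}$$

bits per channel use.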

Theorem 26 (Channel Coding Theorem)
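
In its standard form, the theorem says that all rates below capacity are achievable: for every $R < C$ there exists a sequence of codes of rate $R$ whose probability of error tends to zero as the block length grows. Conversely, any sequence of codes with vanishing error probability must have rate $R \le C$.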
