Concentration

Concentration Inequalities

Theorem 17 (Markov's Inequality)

If $X$ is a nonnegative random variable, then for any $t > 0$,

$$P(X \geq t) \leq \frac{\mathbb{E}[X]}{t}.$$

Theorem 18 (Chebyshev's Inequality)

If $X$ is a random variable with mean $\mu$ and variance $\sigma^2$, then for any $t > 0$,

$$P(|X - \mu| \geq t) \leq \frac{\sigma^2}{t^2}.$$

Intuitively, Theorem 18 gives a “better” bound than Theorem 17 because it incorporates the variance of the random variable rather than just its mean. Pushing this idea further, we can define an even better bound that incorporates information from all moments of the random variable through its moment generating function.

Definition 36 (Chernoff Bound)

For a random variable $X$ and any $t$,

$$P(X \geq t) \leq \inf_{s > 0} \frac{\mathbb{E}\left[e^{sX}\right]}{e^{st}}.$$
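
To see how the three bounds compare on a concrete case, here is a minimal numerical sketch; the choice $X \sim \text{Exponential}(1)$ is an illustrative assumption, made because its mean, variance, tail probability, and moment generating function are all available in closed form.

```python
import numpy as np

# Tail bounds for X ~ Exponential(1): E[X] = 1, Var(X) = 1,
# and E[e^{sX}] = 1 / (1 - s) for s < 1.
mean, var = 1.0, 1.0

for t in [4.0, 8.0, 12.0]:
    true_tail = np.exp(-t)                    # P(X >= t) = e^{-t} for Exp(1)
    markov = mean / t                         # Theorem 17
    chebyshev = var / (t - mean) ** 2         # Theorem 18, via P(X >= t) <= P(|X - mean| >= t - mean)
    chernoff = min(np.exp(-s * t) / (1 - s)   # Definition 36, minimized over a grid of s
                   for s in np.linspace(0.01, 0.99, 99))
    print(f"t={t}: true={true_tail:.2e}  Markov={markov:.2e}  "
          f"Chebyshev={chebyshev:.2e}  Chernoff={chernoff:.2e}")
```

For small $t$ the Chebyshev bound can actually win, but because the Chernoff bound decays exponentially in $t$, it dominates the polynomially decaying Markov and Chebyshev bounds as $t$ grows.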

Convergence

The question of whether a sequence of random variables converges is not as straightforward as it seems, because random variables are functions, and there are many ways to define the convergence of functions.

Definition 37

A sequence of random variables $(X_n)_{n \in \mathbb{N}}$ converges almost surely to $X$ if

$$P\left(\lim_{n \to \infty} X_n = X\right) = 1.$$

One result involving almost sure convergence concerns how the average of many samples deviates around the mean.

Theorem 19 (Strong Law of Large Numbers)

If $X_1, X_2, \ldots$ are i.i.d. random variables with finite mean $\mu$, then

$$\frac{1}{n} \sum_{i=1}^{n} X_i \xrightarrow{\text{a.s.}} \mu.$$

The strong law tells us that for almost every observed realization and any margin $\epsilon > 0$, there is a point after which the sample mean never deviates from $\mu$ by more than $\epsilon$.
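
As a sanity check on the strong law, the following sketch simulates a single realization (the Uniform(0, 1) distribution and the sample sizes are illustrative assumptions) and tracks the running sample mean as it settles near $\mu = 0.5$:

```python
import numpy as np

# A minimal simulation sketch of Theorem 19: along a single realization,
# the running sample mean of i.i.d. Uniform(0, 1) draws settles near mu = 0.5.
rng = np.random.default_rng(0)
x = rng.random(100_000)
running_mean = np.cumsum(x) / np.arange(1, x.size + 1)

for n in [10, 100, 1_000, 10_000, 100_000]:
    print(f"n={n:>6}: running mean = {running_mean[n - 1]:.4f} (mu = 0.5)")
```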

Definition 38

A sequence of random variables $(X_n)_{n \in \mathbb{N}}$ converges in probability to $X$ if, for every $\epsilon > 0$,

$$\lim_{n \to \infty} P\left(|X_n - X| > \epsilon\right) = 0.$$

Convergence in probability helps us formalize the intuition that probability is the frequency with which an event happens over many trials.

Theorem 20 (Weak Law of Large Numbers)

If $X_1, X_2, \ldots$ are i.i.d. random variables with finite mean $\mu$, then

$$\frac{1}{n} \sum_{i=1}^{n} X_i \xrightarrow{p} \mu.$$

By Theorem 20, if $X_i = \mathbb{1}\{A \text{ occurs on trial } i\}$ over independent trials of an event $A$, then

$$\frac{1}{n} \sum_{i=1}^{n} X_i \xrightarrow{p} P(A),$$

meaning over many trials, the empirical frequency of the event approaches the probability of the event, matching intuition.
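
The following sketch illustrates this frequency interpretation (the fair-die example is an illustrative assumption): the empirical frequency of rolling a six approaches $1/6$ as the number of trials grows.

```python
import numpy as np

# Sketch of the frequency interpretation: the fraction of fair-die rolls that
# come up six should approach P(six) = 1/6 as the number of trials grows.
rng = np.random.default_rng(1)
rolls = rng.integers(1, 7, size=1_000_000)   # fair six-sided die
hits = rolls == 6

for n in [100, 10_000, 1_000_000]:
    print(f"n={n:>9}: empirical frequency = {hits[:n].mean():.4f} (p = {1/6:.4f})")
```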

Definition 39

A sequence of random variables $(X_n)_{n \in \mathbb{N}}$ with distribution functions $F_{X_n}$ converges in distribution to $X$ with distribution function $F_X$ if

$$\lim_{n \to \infty} F_{X_n}(x) = F_X(x)$$

at every point $x$ where $F_X$ is continuous.

An example of convergence in distribution is the central limit theorem.

Theorem 21 (Central Limit Theorem)

If $X_1, X_2, \ldots$ are i.i.d. random variables with mean $\mu$ and finite variance $\sigma^2$, then

$$\sqrt{n} \left( \frac{1}{n} \sum_{i=1}^{n} X_i - \mu \right) \xrightarrow{d} \mathcal{N}\left(0, \sigma^2\right).$$
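
The following sketch illustrates the central limit theorem numerically (the Exponential(1) distribution, $n = 250$, and the number of trials are illustrative assumptions): it standardizes many independent sample means and checks that about 95% of them land within $\pm 1.96$, as a standard normal would.

```python
import numpy as np

# Sketch of Theorem 21: standardized means of i.i.d. Exponential(1) samples
# (mu = 1, sigma = 1) should look approximately N(0, 1) for large n.
rng = np.random.default_rng(2)
n, trials = 250, 10_000
samples = rng.exponential(scale=1.0, size=(trials, n))
z = np.sqrt(n) * (samples.mean(axis=1) - 1.0)   # sqrt(n) * (mean - mu) / sigma, with sigma = 1

# A standard normal puts roughly 95% of its mass within +/- 1.96.
print(f"P(|Z| <= 1.96) ~= {(np.abs(z) <= 1.96).mean():.4f} (expected ~0.95)")
```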

These notions of convergence are not equivalent: almost sure convergence implies convergence in probability, and convergence in probability implies convergence in distribution, but neither converse holds in general. For example, if $X \sim \mathcal{N}(0, 1)$ and $X_n = -X$ for all $n$, then $X_n$ converges to $X$ in distribution (each $X_n$ is also standard normal) but not in probability.

Once we know how a sequence of random variables converges, we can also determine how functions of those random variables converge.

Theorem 22 (Continuous Mapping Theorem)

If $g$ is a continuous function and $X_n \to X$ almost surely, in probability, or in distribution, then $g(X_n) \to g(X)$ in the same sense.
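
As a quick sanity check, the sketch below (the Uniform(0, 1) distribution and the choice $g = \exp$ are illustrative assumptions) combines Theorem 20 with Theorem 22: since the sample mean converges in probability to $0.5$, $\exp(\text{sample mean})$ converges in probability to $e^{0.5}$.

```python
import numpy as np

# Sketch of Theorem 22: the sample mean of Uniform(0, 1) draws converges in
# probability to mu = 0.5, so for the continuous function g(x) = exp(x),
# g(sample mean) converges in probability to g(0.5) = exp(0.5) ~= 1.6487.
rng = np.random.default_rng(3)
x = rng.random(1_000_000)

for n in [100, 10_000, 1_000_000]:
    print(f"n={n:>9}: exp(sample mean) = {np.exp(x[:n].mean()):.4f} "
          f"(target {np.exp(0.5):.4f})")
```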
