# Introduction to Probability

One key assumption we make is that $\mathcal{F}$ is a $\sigma$-algebra containing $\Omega$, meaning that countably many complements, unions, and intersections of events in $\mathcal{F}$ are also events in $\mathcal{F}$. The probability measure $P$ must obey **Kolmogorov's Axioms**:

1. $\forall A \in \mathcal{F},\ P(A) \geq 0$
2. $P(\Omega) = 1$
3. If $A_1, A_2, \cdots \in \mathcal{F}$ and $\forall i \ne j,\ A_i \cap A_j = \emptyset$, then $P\left(\bigcup_{i\geq 1}A_i\right) = \sum_{i\geq 1}P(A_i)$
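As a sketch of how the axioms play out concretely, consider a hypothetical finite model: a fair six-sided die, where $\Omega = \{1,\dots,6\}$ and $\mathcal{F}$ is the full power set (which is trivially a $\sigma$-algebra on a finite $\Omega$). All names below are illustrative, not from any library.

```python
from fractions import Fraction
from itertools import combinations

# Hypothetical finite model: a fair six-sided die.
# omega is the sample space; the power set of omega serves as
# the sigma-algebra F (always valid when omega is finite).
omega = frozenset(range(1, 7))
P = {w: Fraction(1, 6) for w in omega}

def prob(event):
    """Probability of an event (a subset of omega)."""
    return sum(P[w] for w in event)

# Enumerate every event in the power set.
events = [frozenset(c) for r in range(len(omega) + 1)
          for c in combinations(omega, r)]

# Axiom 1: non-negativity for every event.
assert all(prob(A) >= 0 for A in events)

# Axiom 2: the whole sample space has probability 1.
assert prob(omega) == 1

# Axiom 3 (finite form): additivity over disjoint events.
A, B = frozenset({1, 2}), frozenset({5, 6})
assert A & B == frozenset()
assert prob(A | B) == prob(A) + prob(B)
```

Using `Fraction` keeps the arithmetic exact, so the axioms can be checked with equality rather than floating-point tolerances.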

We choose $\Omega$ and $\mathcal{F}$ to model problems in a way that makes our calculations easy.

Intuitively, conditional probability is the probability of event $A$ given that event $B$ has occurred. Formally, for $P(B) > 0$, it is $P(A|B) = \frac{P(A \cap B)}{P(B)}$. In terms of probability spaces, it is as if we have taken $(\Omega, \mathcal{F}, P)$ and now have a probability measure $P(\cdot|B)$ belonging to the space $(\Omega, \mathcal{F}, P(\cdot|B))$.

If $P(B) > 0$, then $A, B$ are independent if and only if $P(A|B) = P(A)$. In other words, knowing $B$ occurred gives no extra information about $A$.
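This characterization of independence can be checked directly on a small model. The sketch below (names are illustrative) uses two fair dice, with $A$ = "the first die is even" and $B$ = "the second die shows 5 or 6"; knowing $B$ leaves $P(A)$ unchanged.

```python
from fractions import Fraction
from itertools import product

# Hypothetical model: two fair dice, uniform measure on 36 outcomes.
omega = set(product(range(1, 7), repeat=2))
P = {w: Fraction(1, 36) for w in omega}

def prob(event):
    """Probability of an event (a subset of omega)."""
    return sum(P[w] for w in event)

def cond_prob(A, B):
    """P(A|B) = P(A and B) / P(B), assuming P(B) > 0."""
    return prob(A & B) / prob(B)

A = {w for w in omega if w[0] % 2 == 0}  # first die is even
B = {w for w in omega if w[1] > 4}       # second die shows 5 or 6

# Knowing B occurred gives no extra information about A:
assert cond_prob(A, B) == prob(A)  # both equal 1/2
```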

Conditional independence is a special case of independence where $A$ and $B$ are not necessarily independent in the original probability space with measure $P$, but are independent in the new probability space conditioned on $C$, with measure $P(\cdot|C)$.
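A classic illustration (a sketch with hypothetical names, not from the text above): pick one of two coins uniformly at random, one fair and one biased with $P(\text{heads}) = 9/10$, then flip it twice. The two flips are dependent under $P$, since each heads is evidence that the biased coin was picked, yet they are independent under $P(\cdot|C)$ once we condition on which coin was chosen.

```python
from fractions import Fraction
from itertools import product

# Hypothetical model: choose a coin uniformly, then flip it twice.
# Outcomes are (coin, flip1, flip2); flips are i.i.d. given the coin.
bias = {'fair': Fraction(1, 2), 'biased': Fraction(9, 10)}
P = {}
for coin, f1, f2 in product(bias, 'HT', 'HT'):
    p1 = bias[coin] if f1 == 'H' else 1 - bias[coin]
    p2 = bias[coin] if f2 == 'H' else 1 - bias[coin]
    P[(coin, f1, f2)] = Fraction(1, 2) * p1 * p2

def prob(event):
    """Probability of an event (a subset of the sample space)."""
    return sum(p for w, p in P.items() if w in event)

def cond_prob(A, C):
    """P(A|C) = P(A and C) / P(C), assuming P(C) > 0."""
    return prob(A & C) / prob(C)

omega = set(P)
A = {w for w in omega if w[1] == 'H'}       # first flip is heads
B = {w for w in omega if w[2] == 'H'}       # second flip is heads
C = {w for w in omega if w[0] == 'biased'}  # the biased coin was picked

# Under P, the flips are dependent (each hints at which coin was drawn):
assert prob(A & B) != prob(A) * prob(B)

# Under P(.|C), they are independent:
assert cond_prob(A & B, C) == cond_prob(A, C) * cond_prob(B, C)
```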