Random Processes

Definition 47

The random variables in a stochastic process do not have to be independent and identically distributed. In fact, when they are not, we gain additional modeling power.

Definition 48

Stationarity is often a good assumption that can simplify systems which have been running for a long period of time.

Discrete Time Markov Chains

Definition 49

In words, a Markov Chain is a sequence of random variables satisfying the Markov Property: the probability of being in a state at the next time step depends only on the current state.

Definition 50

Definition 51

The transition matrix encodes the one-step transition probabilities of the Markov Chain.

Theorem 27 (Chapman-Kolmogorov Equation)
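A consequence of the Chapman-Kolmogorov equation is that n-step transition probabilities come from powers of the one-step transition matrix. A minimal sketch in pure Python; the 2-state chain and its probabilities are made up for illustration:

```python
# Sketch: the Chapman-Kolmogorov equation lets us compute n-step transition
# probabilities by multiplying one-step transition matrices.
# The chain below is illustrative, not from the text.

def mat_mul(A, B):
    """Multiply two square matrices given as lists of lists."""
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

def n_step(P, n):
    """Return the n-step transition matrix P^n."""
    result = [[float(i == j) for j in range(len(P))] for i in range(len(P))]  # identity
    for _ in range(n):
        result = mat_mul(result, P)
    return result

# A 2-state chain: from state 0 stay w.p. 0.7; from state 1 stay w.p. 0.4.
P = [[0.7, 0.3],
     [0.6, 0.4]]

P2 = n_step(P, 2)
# Check against summing over the intermediate state directly:
# P(X_2 = 1 | X_0 = 0) = P[0][0]*P[0][1] + P[0][1]*P[1][1]
assert abs(P2[0][1] - (0.7 * 0.3 + 0.3 * 0.4)) < 1e-12
```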

One useful quantity we can compute with a Markov Chain is when the chain first enters a particular state.

Definition 52

Computing the expected hitting time is an example of a broader type of Markov Chain analysis called First Step Analysis. In First Step Analysis, we use the Markov property to set up a system of equations that conditions on the first transition of the chain. For the expected hitting time of a target set A, writing beta(i) for the expected hitting time starting from state i, these look like beta(i) = 0 for i in A, and beta(i) = 1 + sum_j p_ij beta(j) for i not in A.
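First Step Analysis can be carried out numerically. A sketch, assuming the standard hitting-time equations beta(i) = 0 on the target set and beta(i) = 1 + sum_j P[i][j] * beta(j) elsewhere; the 3-state chain is made up for illustration, and we solve by fixed-point iteration rather than direct elimination:

```python
# Sketch of First Step Analysis for expected hitting times. We iterate
# beta <- 1 + P beta (with beta pinned to 0 on the target set), which
# converges when the target is reached with probability 1.

def expected_hitting_times(P, target, iters=10_000):
    """Return expected hitting times of `target` from each state."""
    n = len(P)
    beta = [0.0] * n
    for _ in range(iters):
        beta = [0.0 if i in target else
                1.0 + sum(P[i][j] * beta[j] for j in range(n))
                for i in range(n)]
    return beta

# States {0, 1, 2}; from 0 we always move to 1; from 1 we move to 0 or 2
# with equal probability; state 2 is the (absorbing) target.
P = [[0.0, 1.0, 0.0],
     [0.5, 0.0, 0.5],
     [0.0, 0.0, 1.0]]
beta = expected_hitting_times(P, target={2})
# Solving the first-step equations by hand gives beta(0) = 4, beta(1) = 3.
assert abs(beta[0] - 4.0) < 1e-6 and abs(beta[1] - 3.0) < 1e-6
```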

Properties of Markov Chains

Definition 53

Definition 54

Definition 55

Definition 56

Theorem 28

If the graph of a Markov Chain (obtained from the state transition diagram by making edges undirected, removing self-loops, and collapsing multiple edges) is a tree, then the Markov Chain is reversible.

Class Properties

A class property is a property where if one element of a class has the property, all elements of the class have the property. Markov Chains have several of these properties which allow us to classify states.

Definition 57

Definition 58

Recurrence means that we will visit a state infinitely often in the future if we start in that state, while transience means we will only visit the state finitely many times. Recurrence and transience can be easily identified from the transition diagram.

  1. Any finite communicating class which has no edges leaving the class is recurrent

  2. If a state has an edge leading outside its communicating class, then it is transient

  3. If a state is recurrent, then any state it can reach is recurrent
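For a finite chain, the three rules above can be checked mechanically: compute which states communicate, group them into classes, and mark a class recurrent exactly when no path leaves it. A sketch in pure Python, with a made-up 3-state chain:

```python
# Sketch: classify states of a finite DTMC as recurrent or transient.
# A finite communicating class is recurrent exactly when nothing
# reachable from it lies outside the class. The example chain is
# made up for illustration.

def reachable(adj, s):
    """Set of states reachable from s (including s), by DFS."""
    seen, stack = {s}, [s]
    while stack:
        u = stack.pop()
        for v in adj[u]:
            if v not in seen:
                seen.add(v)
                stack.append(v)
    return seen

def classify(adj):
    """Return (classes, recurrent), where recurrent maps state -> bool."""
    n = len(adj)
    reach = [reachable(adj, i) for i in range(n)]
    classes, assigned = [], {}
    for i in range(n):
        if i in assigned:
            continue
        # i and j communicate iff each reaches the other
        cls = frozenset(j for j in range(n) if j in reach[i] and i in reach[j])
        classes.append(cls)
        for j in cls:
            assigned[j] = cls
    # a class is recurrent iff nothing reachable from it leaves the class
    recurrent = {i: reach[i] <= assigned[i] for i in range(n)}
    return classes, recurrent

# States 0 and 1 communicate but can leak into the absorbing state 2.
adj = {0: [1], 1: [0, 2], 2: [2]}
classes, recurrent = classify(adj)
assert recurrent[2] and not recurrent[0] and not recurrent[1]
```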

Definition 59

Definition 60

Positive recurrence means we visit a recurrent state so frequently that we spend a positive fraction of time in that state. Null recurrence means we visit a recurrent state so infrequently (but still infinitely many times) that we spend virtually no time in that state.

Theorem 29

Every irreducible finite state Markov Chain is positive recurrent.

Definition 61

Definition 62

An irreducible Markov Chain is aperiodic if any state has period 1.

All of the above properties are class properties.
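The period of an irreducible chain can be computed from its transition graph. One way (an assumption of this sketch, not a method from the text) uses BFS depths d(.) from an arbitrary state: the period is the gcd of d(u) + 1 - d(v) over all edges u -> v. The example chains are made up:

```python
from math import gcd
from collections import deque

# Sketch: compute the period of an irreducible chain from BFS depths.
# The period is gcd of d(u) + 1 - d(v) over all edges u -> v.

def period(adj, root=0):
    depth = {root: 0}
    queue = deque([root])
    while queue:
        u = queue.popleft()
        for v in adj[u]:
            if v not in depth:
                depth[v] = depth[u] + 1
                queue.append(v)
    g = 0
    for u in adj:
        for v in adj[u]:
            g = gcd(g, abs(depth[u] + 1 - depth[v]))
    return g

# A deterministic 3-cycle has period 3; adding a self-loop makes it aperiodic.
assert period({0: [1], 1: [2], 2: [0]}) == 3
assert period({0: [1], 1: [2], 2: [0, 2]}) == 1
```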

Long-Term Behavior of Markov Chains

Definition 63

Theorem 30

If an irreducible Markov Chain is at stationarity, then the flow-in equals flow-out relationship holds for any cut of the Markov Chain, where a cut is a partition of the states into two disjoint subsets.

Theorem 30 is one useful result that can help solve for stationary distributions.
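Both ideas can be checked numerically. A sketch that finds a stationary distribution by power iteration and then verifies the cut-balance relationship of Theorem 30; the 2-state chain is made up for illustration:

```python
# Sketch: find a stationary distribution by power iteration (pi P^n -> pi
# for an irreducible aperiodic chain), then check flow-in = flow-out
# across a cut.

P = [[0.7, 0.3],
     [0.6, 0.4]]

pi = [0.5, 0.5]
for _ in range(1000):
    pi = [sum(pi[i] * P[i][j] for i in range(2)) for j in range(2)]

# Exact answer for this chain: pi = (q/(p+q), p/(p+q)) with p=0.3, q=0.6.
assert abs(pi[0] - 2 / 3) < 1e-9 and abs(pi[1] - 1 / 3) < 1e-9

# Cut {0} | {1}: flow out of {0} equals flow into {0} at stationarity.
assert abs(pi[0] * P[0][1] - pi[1] * P[1][0]) < 1e-9
```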

Theorem 31 (Big Theorem for Markov Chains)

  1. All states are positive recurrent and the stationary distribution exists, is unique, and satisfies

Theorem 32 (Convergence Theorem)

Continuous Time Markov Chains

Definition 64

To characterize how a CTMC functions, we need to define some additional quantities.

Definition 65

The jump chain is a DTMC which describes the transition probabilities between states in the CTMC.

Essentially, every time we enter a state, we set an alarm clock for all other states, and then jump to the state whose alarm clock goes off first. This equivalent interpretation allows us to summarize a CTMC using the rate matrix.
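The alarm-clock interpretation translates directly into a simulator. A sketch, with made-up rates: on entering state i we draw an Exponential(q_ij) clock for each possible destination j and jump to whichever rings first.

```python
import random

# Sketch of the alarm-clock interpretation of a CTMC.
# RATES[i][j] = q_ij, the transition rate from i to j (made-up numbers).

RATES = {
    0: {1: 1.0, 2: 0.5},
    1: {0: 2.0},
    2: {0: 1.0, 1: 3.0},
}

def simulate(start, t_end, rng):
    """Return the list of (time, state) jumps up to time t_end."""
    t, state, path = 0.0, start, [(0.0, start)]
    while True:
        # set one alarm clock per possible destination
        clocks = {j: rng.expovariate(q) for j, q in RATES[state].items()}
        j = min(clocks, key=clocks.get)   # first clock to ring wins
        t += clocks[j]
        if t > t_end:
            return path
        state = j
        path.append((t, state))

rng = random.Random(0)
path = simulate(0, t_end=10.0, rng=rng)
assert all(s in RATES for _, s in path)                     # valid states
assert all(a[0] < b[0] for a, b in zip(path, path[1:]))     # times increase
```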

Definition 66

Class Properties

Just like in DTMCs, we can classify states in the CTMC.

Definition 67

Definition 68

Definition 69

Long Term Behavior of CTMCs

CTMCs also have stationary distributions.

Definition 70

The stationary distribution of the CTMC is also related to the jump chain, but we need to normalize for the holding times.

Theorem 33
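A sketch of the normalization described above, assuming the relationship pi_i proportional to pi_jump_i / q_i, where pi_jump is the jump chain's stationary distribution and q_i is the total rate out of state i (so 1/q_i is the mean holding time). The 2-state rates are made up; we sanity-check against pi Q = 0.

```python
# Sketch: recover the CTMC stationary distribution from the jump chain
# by weighting each state by its mean holding time 1/q_i, then normalizing.

# 2-state CTMC: rate 1.0 from state 0 to 1, rate 2.0 from 1 to 0.
q = [1.0, 2.0]          # total rate out of each state
pi_jump = [0.5, 0.5]    # the jump chain alternates 0,1,0,1,... -> uniform

weights = [pi_jump[i] / q[i] for i in range(2)]
total = sum(weights)
pi = [w / total for w in weights]
assert abs(pi[0] - 2 / 3) < 1e-9 and abs(pi[1] - 1 / 3) < 1e-9

# Sanity check: pi Q = 0 for the rate matrix Q.
Q = [[-1.0, 1.0],
     [2.0, -2.0]]
for j in range(2):
    assert abs(sum(pi[i] * Q[i][j] for i in range(2))) < 1e-9
```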

Theorem 34 (Big Theorem for CTMCs)

For an irreducible CTMC, exactly one of the following is true.

  1. All states are positive recurrent, a unique stationary distribution exists, and the stationary distribution satisfies

Uniformization

Theorem 35 (Forward Kolmogorov Equation)

Definition 71

It turns out that

Poisson Processes

Definition 72

There are two important metrics which describe counting processes.

Definition 73

Definition 74

Definition 75

The name Poisson comes from the distribution of each variable in the process.

Theorem 36

Theorem 37

Theorem 38

Poisson Processes are the only counting processes with these particular properties.
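One standard way to simulate a Poisson process is through its i.i.d. Exponential interarrival times. A sketch with made-up rate and horizon, checking that the count N(t) has mean close to lambda * t:

```python
import random

# Sketch: simulate a rate-lambda Poisson process via Exponential
# interarrival times and estimate E[N(t)], which should be lambda * t.

def count_arrivals(rate, t_end, rng):
    """Number of arrivals in [0, t_end] for a rate-`rate` Poisson process."""
    t, n = 0.0, 0
    while True:
        t += rng.expovariate(rate)
        if t > t_end:
            return n
        n += 1

rng = random.Random(0)
rate, t_end, trials = 2.0, 5.0, 20_000
mean = sum(count_arrivals(rate, t_end, rng) for _ in range(trials)) / trials
assert abs(mean - rate * t_end) < 0.2   # E[N(t)] = lambda * t = 10
```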

It turns out that Poisson Processes can be connected with the Order Statistics of Uniform Random Variables.

Theorem 39 (Conditional Distribution of Arrivals)

Two other useful properties of Poisson Processes involve combining and separating them.

Theorem 40 (Poisson Merging)

Theorem 41 (Poisson Splitting)
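Poisson splitting can be illustrated by simulation: thinning a rate-lambda process with an independent coin of bias p per arrival yields two processes with rates lambda*p and lambda*(1-p). A sketch with made-up parameters, checking only the mean counts of the two split streams:

```python
import random

# Sketch of Poisson splitting: flip an independent coin per arrival and
# route it to one of two streams; each stream is Poisson with the
# correspondingly thinned rate. Parameters are made up for illustration.

rng = random.Random(1)
rate, p, t_end, trials = 3.0, 0.25, 4.0, 20_000

heads_total = tails_total = 0
for _ in range(trials):
    t = 0.0
    while True:
        t += rng.expovariate(rate)
        if t > t_end:
            break
        if rng.random() < p:       # independent coin flip per arrival
            heads_total += 1
        else:
            tails_total += 1

# Split rates: lambda*p = 0.75 and lambda*(1-p) = 2.25, over t_end = 4.
assert abs(heads_total / trials - rate * p * t_end) < 0.2
assert abs(tails_total / trials - rate * (1 - p) * t_end) < 0.2
```

Merging runs the same argument in reverse: superposing the two streams recovers a single rate-lambda process, consistent with Theorem 40.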
