# Random Processes

The random variables in a stochastic process need not be independent and identically distributed. In fact, dropping that assumption gives us additional modeling power.

Stationarity is often a good modeling assumption for systems that have been running for a long period of time, and it can greatly simplify their analysis.

## Discrete Time Markov Chains

In words, a Markov Chain is a sequence of random variables satisfying the Markov Property: the probability of being in a state during the next time step depends only on the current state, not on the rest of the history.

The transition matrix encodes the one-step transition probabilities of the Markov Chain.
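To make the transition matrix concrete, here is a minimal sketch (with a made-up 3-state matrix, not one from these notes) of simulating a chain from it: each row is the conditional distribution of the next state given the current one.

```python
import random

# Hypothetical 3-state chain (the numbers below are invented for illustration).
# Row i of the transition matrix gives P(next state = j | current state = i).
P = [
    [0.5, 0.5, 0.0],
    [0.2, 0.3, 0.5],
    [0.0, 0.4, 0.6],
]

def simulate_dtmc(P, start, steps, seed=0):
    """Sample a trajectory by drawing each next state from the current row."""
    rng = random.Random(seed)
    path = [start]
    for _ in range(steps):
        path.append(rng.choices(range(len(P)), weights=P[path[-1]])[0])
    return path

path = simulate_dtmc(P, start=0, steps=10)
```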

One useful quantity we can compute with a Markov Chain is the hitting time: the first time the chain enters a particular state.

Computing the expected hitting time is an example of a broader type of Markov Chain analysis called **First Step Analysis**. In First Step Analysis, we use the Markov property to set up a system of equations that conditions only on the first transition of the chain. For the expected hitting time of a target set $A$, with $\beta(i) = \mathbb{E}[\text{time to reach } A \mid X_0 = i]$, these look like

$$\beta(i) = 0 \ \text{ for } i \in A, \qquad \beta(i) = 1 + \sum_j P_{ij}\,\beta(j) \ \text{ for } i \notin A.$$
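As a sketch of how such a first-step system can be solved numerically, the snippet below uses a hypothetical 3-state chain with invented probabilities; state 2 is the target, so its row is made absorbing for the computation, and the equations are iterated to a fixed point.

```python
# Hypothetical 3-state chain; state 2 is the target (made absorbing here).
P = [
    [0.5, 0.5, 0.0],
    [0.2, 0.3, 0.5],
    [0.0, 0.0, 1.0],
]
target = 2

# beta[i] = expected number of steps to reach the target starting from i.
# Iterating beta <- 1 + P * beta (with beta[target] pinned to 0) converges
# to the solution of the first-step equations for this chain.
beta = [0.0, 0.0, 0.0]
for _ in range(10_000):
    beta = [
        0.0 if i == target
        else 1.0 + sum(P[i][j] * beta[j] for j in range(3))
        for i in range(3)
    ]
```

Solving the same two equations by hand gives beta = (4.8, 2.8, 0) for this example, which the iteration reproduces.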

### Properties of Markov Chains

### Class Properties

A class property is a property such that if one state in a communicating class has it, then every state in that class has it. Markov Chains have several such properties, which allow us to classify states.

Recurrence means that we will visit a state infinitely often in the future if we start in that state, while transience means we will only visit the state finitely many times. Recurrence and transience can be easily identified from the transition diagram.

- Any finite communicating class which has no edges leaving the class is recurrent.
- If a state has an edge leading outside its communicating class, then it is transient.
- If a state is recurrent, then any state it can reach is recurrent.
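These rules can be checked mechanically on a finite chain. Below is a sketch (with an invented 3-state matrix) that finds communicating classes via mutual reachability and marks a class recurrent exactly when no state in it can reach a state outside it.

```python
# Hypothetical chain: states 0 and 1 communicate and form a closed class;
# state 2 can leave to state 0, so it should come out transient.
P = [
    [0.5, 0.5, 0.0],
    [0.6, 0.4, 0.0],
    [0.3, 0.0, 0.7],
]

def reachable(P, i):
    """States reachable from i (including i) along positive-probability edges."""
    seen, stack = {i}, [i]
    while stack:
        u = stack.pop()
        for v, p in enumerate(P[u]):
            if p > 0 and v not in seen:
                seen.add(v)
                stack.append(v)
    return seen

n = len(P)
reach = [reachable(P, i) for i in range(n)]
# i and j communicate iff each can reach the other.
classes = {frozenset(j for j in range(n) if i in reach[j] and j in reach[i])
           for i in range(n)}
# A finite class is recurrent iff no state in it reaches outside the class.
recurrent = {c for c in classes if all(reach[i] <= c for i in c)}
```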

Positive recurrence means we visit a recurrent state so frequently that we spend a positive long-run fraction of time in that state. Null recurrence means we visit a recurrent state so infrequently (but still infinitely many times) that we spend virtually no time in that state.

All of the above properties are class properties.

### Long-Term Behavior of Markov Chains

Theorem 30 is one useful result that can help solve for stationary distributions.
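As a concrete sketch, a stationary distribution can also be found numerically by power iteration: repeatedly multiply a starting distribution by the transition matrix until it stops changing. The two-state matrix below is invented for illustration, and the iteration converges because this example is irreducible and aperiodic.

```python
# Hypothetical two-state chain.
P = [
    [0.9, 0.1],
    [0.5, 0.5],
]

# Start from any distribution and iterate pi <- pi P.
pi = [0.5, 0.5]
for _ in range(1000):
    pi = [sum(pi[i] * P[i][j] for i in range(2)) for j in range(2)]

# For this chain the balance equations give pi = (5/6, 1/6).
```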

## Continuous Time Markov Chains

To characterize how a CTMC functions, we need to define some additional quantities.

Essentially, every time we enter a state, we set an alarm clock for all other states, and then jump to the state whose alarm clock goes off first. This equivalent interpretation allows us to summarize a CTMC using the rate matrix.
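The alarm-clock picture translates directly into a simulation. This sketch uses an invented rate matrix: from state i, each other state j with a positive rate gets an Exp(q_ij) clock, and the chain jumps to whichever clock rings first, after holding for that clock's duration.

```python
import random

# Hypothetical 3-state rate matrix; off-diagonal entries are jump rates,
# and each diagonal entry is minus the total exit rate of its state.
Q = [
    [-3.0, 2.0, 1.0],
    [1.0, -1.0, 0.0],
    [2.0, 2.0, -4.0],
]

def step(Q, i, rng):
    """One jump of the CTMC: returns (holding time, next state)."""
    clocks = {j: rng.expovariate(Q[i][j])
              for j in range(len(Q)) if j != i and Q[i][j] > 0}
    j = min(clocks, key=clocks.get)  # earliest alarm wins
    return clocks[j], j

rng = random.Random(0)
t, state = 0.0, 0
for _ in range(5):
    hold, state = step(Q, state, rng)
    t += hold
```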

### Class Properties

Just like in DTMCs, we can classify states in the CTMC.

### Long Term Behavior of CTMCs

CTMCs also have stationary distributions.

The stationary distribution of the CTMC is also related to the jump chain, but we need to weight the jump chain's stationary distribution by the mean holding times and renormalize.
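A sketch of that weighting on an invented two-state rate matrix: multiply each state's jump-chain probability by its mean holding time 1/q_i, then renormalize so the result sums to one.

```python
# Hypothetical two-state rate matrix; q[i] = -Q[i][i] is the exit rate of i.
Q = [
    [-2.0, 2.0],
    [1.0, -1.0],
]
q = [-Q[i][i] for i in range(2)]

# The jump chain of a two-state CTMC alternates deterministically, so its
# stationary distribution is (1/2, 1/2).
mu = [0.5, 0.5]

# Weight by mean holding times 1/q[i] and renormalize.
w = [mu[i] / q[i] for i in range(2)]
pi = [x / sum(w) for x in w]
```

Plugging in gives pi = (1/3, 2/3) here, which also satisfies the CTMC balance equations pi Q = 0.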

### Uniformization

It turns out that a CTMC with rate matrix $Q$ can be analyzed through a DTMC: pick any uniform rate $\lambda \geq \max_i q_i$ (where $q_i$ is the exit rate of state $i$) and let transitions occur at the arrival times of a rate-$\lambda$ Poisson process. The DTMC with transition matrix $P = I + \frac{1}{\lambda} Q$ then describes the same process and has the same stationary distribution as the CTMC.
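A sketch of the uniformization construction on an invented two-state rate matrix: build P = I + Q/lam for a uniform rate lam at least as large as every exit rate, then find the DTMC's stationary distribution by power iteration and note that it matches the CTMC's.

```python
# Hypothetical two-state rate matrix.
Q = [
    [-2.0, 2.0],
    [1.0, -1.0],
]
lam = 2.0  # uniform rate, chosen >= every exit rate

# Uniformized DTMC: P = I + Q / lam (rows sum to 1 by construction).
P = [[(1.0 if i == j else 0.0) + Q[i][j] / lam for j in range(2)]
     for i in range(2)]

# Power-iterate to find the stationary distribution of P.
pi = [0.5, 0.5]
for _ in range(1000):
    pi = [sum(pi[i] * P[i][j] for i in range(2)) for j in range(2)]

# For this Q, solving pi Q = 0 directly also gives pi = (1/3, 2/3).
```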

### Poisson Processes

There are two important metrics which describe counting processes.

The name Poisson comes from the distribution of each variable in the process.
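As a sketch (rate and horizon invented), a Poisson process can be simulated by summing Exp(lam) interarrival times; the number of arrivals in [0, t] then has a Poisson(lam * t) distribution, with mean lam * t.

```python
import random

def arrivals(lam, t, rng):
    """Arrival times in [0, t] of a rate-lam Poisson process."""
    times, clock = [], 0.0
    while True:
        clock += rng.expovariate(lam)  # Exp(lam) interarrival times
        if clock > t:
            return times
        times.append(clock)

rng = random.Random(0)
sample = arrivals(2.0, 10.0, rng)  # expect around lam * t = 20 arrivals
```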

Poisson Processes are the only counting processes with these particular properties.

It turns out that Poisson Processes can be connected with the Order Statistics of Uniform Random Variables.
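Concretely: conditioned on N(t) = n, the n arrival times are distributed as the order statistics of n i.i.d. Uniform(0, t) draws. This gives a second way to simulate the process, sketched below with invented parameters (the Poisson count is drawn by counting exponential interarrivals, to keep the snippet dependency-free).

```python
import random

def arrivals_via_uniforms(lam, t, rng):
    """Simulate a Poisson process on [0, t] via the order-statistics property."""
    # Draw N ~ Poisson(lam * t) by counting Exp(lam) interarrivals in [0, t].
    n, clock = 0, rng.expovariate(lam)
    while clock <= t:
        n += 1
        clock += rng.expovariate(lam)
    # Given N = n, the arrival times are sorted Uniform(0, t) draws.
    return sorted(rng.uniform(0, t) for _ in range(n))

rng = random.Random(0)
sample = arrivals_via_uniforms(2.0, 10.0, rng)
```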

Two other useful properties of Poisson Processes involve combining and separating them.
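Both operations are easy to sketch with invented rates: merging (superposing) two independent processes with rates a and b yields a Poisson process of rate a + b, while independently keeping each arrival with probability p (thinning) turns a rate-lam process into a rate p * lam one.

```python
import random

def poisson_arrivals(lam, t, rng):
    """Arrival times in [0, t] of a rate-lam Poisson process."""
    times, clock = [], 0.0
    while True:
        clock += rng.expovariate(lam)
        if clock > t:
            return times
        times.append(clock)

rng = random.Random(0)

# Merging: superpose rate-1 and rate-2 processes -> behaves like rate 3.
merged = sorted(poisson_arrivals(1.0, 100.0, rng) + poisson_arrivals(2.0, 100.0, rng))

# Thinning: keep each arrival of a rate-3 process with prob 1/2 -> rate 1.5.
thinned = [x for x in poisson_arrivals(3.0, 100.0, rng) if rng.random() < 0.5]
```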
