Discrete Time Random Processes

Definition 4

Wide-Sense Stationary Random Processes

Definition 5

We call this wide-sense stationary because the mean and covariance do not change as the process evolves. In a strict-sense stationary process, the joint distribution of every finite collection of random variables in the process would be invariant to time shifts.
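To make the distinction concrete, here is a minimal numerical sketch (illustrative, assuming NumPy): i.i.d. Gaussian noise is strict-sense stationary, so in particular its sample mean and its sample autocorrelation at a fixed lag do not depend on absolute time.

```python
import numpy as np

rng = np.random.default_rng(0)
# i.i.d. Gaussian noise is strict-sense (hence wide-sense) stationary.
X = rng.standard_normal((10000, 50))    # 10000 sample paths, 50 time steps

means = X.mean(axis=0)                  # estimate of E[X_n] for each n: all ~ 0

# Sample autocorrelation E[X_n X_{n+k}] at a fixed lag k, for two start times:
k = 3
r_a = np.mean(X[:, 0] * X[:, 0 + k])
r_b = np.mean(X[:, 20] * X[:, 20 + k])
print(means.max(), r_a, r_b)            # mean ~ 0, and r_a ~ r_b
```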

Definition 6

Spectral Density

Recall that the Discrete Time Fourier Transform of a signal $x(n)$ is given by

$$X(e^{j\omega}) = \sum_{n=-\infty}^{\infty} x(n)e^{-j\omega n}.$$

The Inverse Discrete Time Fourier Transform is given by

$$x(n) = \frac{1}{2\pi}\int_{-\pi}^{\pi} X(e^{j\omega})e^{j\omega n}\,d\omega.$$

Since the DTFT is an infinite summation, it may or may not converge.
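As a quick numerical illustration (assuming NumPy; the helper `dtft` is ours, not a library function), we can evaluate the DTFT of a finite-length signal on a grid of frequencies, where the infinite sum trivially converges because only finitely many terms are nonzero.

```python
import numpy as np

def dtft(x, omegas):
    """Evaluate X(e^{jw}) = sum_n x[n] e^{-jwn} for a finite-length signal x
    (assumed zero outside 0..len(x)-1)."""
    n = np.arange(len(x))
    return np.array([np.sum(x * np.exp(-1j * w * n)) for w in omegas])

x = np.array([1.0, 0.5, 0.25])           # finite-energy signal, DTFT converges
omegas = np.linspace(-np.pi, np.pi, 5)   # -pi, -pi/2, 0, pi/2, pi
X = dtft(x, omegas)
print(X[2])  # at omega = 0, X(e^{j0}) is just the sum of x: 1.75
```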

Definition 7

This class covers most real-world signals.

Theorem 5

Definition 8

Theorem 6

Tempered distributions, such as the Dirac delta, are generalized functions that are also important for computing DTFTs; they arise from the theory of distributions.
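For example (a standard fact, not taken verbatim from these notes), the constant signal $x(n) = 1$ has no convergent DTFT sum, but interpreted as a tempered distribution its DTFT is a Dirac comb:

$$\sum_{n=-\infty}^{\infty} e^{-j\omega n} = 2\pi\sum_{k=-\infty}^{\infty}\delta(\omega - 2\pi k).$$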

Suppose we want to characterize the signal using its DTFT.

Definition 9

Definition 10

We call the DTFT of the autocorrelation the energy spectral density because, by the Inverse DTFT,

$$\sum_{n=-\infty}^{\infty}|x(n)|^2 = \frac{1}{2\pi}\int_{-\pi}^{\pi}|X(e^{j\omega})|^2\,d\omega,$$

so integrating it over frequency recovers the total energy of the signal.
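A quick numerical check (illustrative) that the time-domain energy $\sum_n |x(n)|^2$ equals $\frac{1}{2\pi}\int_{-\pi}^{\pi}|X(e^{j\omega})|^2\,d\omega$, using a Riemann sum for the frequency integral:

```python
import numpy as np

x = np.array([1.0, -0.5, 0.25, 2.0])
n = np.arange(len(x))

# DTFT of x on a dense grid, then the energy spectral density |X(e^{jw})|^2.
omegas = np.linspace(-np.pi, np.pi, 4001)
X = np.array([np.sum(x * np.exp(-1j * w * n)) for w in omegas])
esd = np.abs(X) ** 2

energy_time = np.sum(x ** 2)
# Left Riemann sum over one full period for (1/2pi) * integral of the ESD.
dw = omegas[1] - omegas[0]
energy_freq = np.sum(esd[:-1]) * dw / (2 * np.pi)
print(energy_time, energy_freq)   # both ~ 5.3125
```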

Definition 11

The Power Spectral Density of a Wide-Sense Stationary random process is the DTFT of its autocorrelation function:

$$S_X(e^{j\omega}) = \sum_{k=-\infty}^{\infty} R_X(k)e^{-j\omega k}.$$
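As a sketch, consider the MA(1) process $X_n = W_n + aW_{n-1}$ driven by unit-variance white noise (an illustrative example, not from the notes). Its autocorrelation has finite support, $R_X(0) = 1 + a^2$, $R_X(\pm 1) = a$, and $R_X(k) = 0$ otherwise, so its power spectral density can be computed directly as the DTFT of the autocorrelation:

```python
import numpy as np

a = 0.5
lags = np.array([-1, 0, 1])
R = np.array([a, 1 + a**2, a])     # autocorrelation of the MA(1) process

# PSD as the DTFT of the autocorrelation sequence.
omegas = np.linspace(-np.pi, np.pi, 5)
S = np.array([np.sum(R * np.exp(-1j * w * lags)) for w in omegas]).real
print(S)  # matches the closed form 1 + a^2 + 2a*cos(w)
```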

Theorem 7 (Cramer-Khinchin)

Besides giving us a decomposition of a WSS random process, Theorem 7 tells us a few important facts.

Z-Spectrum

Recall that the Z-transform converts a discrete-time signal into a complex-valued function of the complex variable $z$. It is given by

$$X(z) = \sum_{n=-\infty}^{\infty} x(n)z^{-n}.$$

It is a special type of series called a Laurent Series.
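For instance, the signal $x(n) = a^n u(n)$ has Z-transform $\frac{1}{1 - az^{-1}}$, which converges for $|z| > |a|$. A quick numerical check (illustrative values) that the series agrees with the closed form at a point inside that region:

```python
import numpy as np

a = 0.5
z = 2.0 * np.exp(1j * 0.3)         # a point with |z| > |a|, inside the ROC

# Partial sum of the geometric series sum_n (a/z)^n.
n = np.arange(200)
partial = np.sum((a / z) ** n)
closed = 1.0 / (1.0 - a / z)       # closed form 1 / (1 - a z^{-1})
print(partial, closed)
```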

Theorem 8

A Laurent Series will converge absolutely on an open annulus $r < |z| < R$, known as the region of convergence.

In some cases, it can be useful to compute the Z-transform of only the causal part of the signal (the samples at $n \geq 0$).

Definition 12

If the Z-transform of the sequence is a rational function, then we can quickly compute its unilateral Z-transform by leveraging the partial fraction decomposition.
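As an illustration (using SciPy's `residuez`, which computes the partial fraction expansion of a rational Z-transform written in powers of $z^{-1}$; the coefficients below are our own example):

```python
import numpy as np
from scipy import signal

# Partial fractions of
#   X(z) = 1 / (1 - 1.5 z^{-1} + 0.5 z^{-2})
#        = 2 / (1 - z^{-1}) - 1 / (1 - 0.5 z^{-1}).
b = [1.0]                  # numerator coefficients in powers of z^{-1}
a = [1.0, -1.5, 0.5]       # denominator coefficients in powers of z^{-1}
r, p, k = signal.residuez(b, a)
print(r, p, k)             # residues ~ [2, -1], poles ~ [1, 0.5]
```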

Theorem 9

Definition 13

Using this definition, we can see that

Definition 14

Markov Processes

Definition 15

Mathematically, Markov triplets satisfy three properties.

Theorem 10

Definition 16

Because of the conditional independence property, we can write the joint distribution of all states in the Markov process as

$$p(x_0, x_1, \ldots, x_n) = p(x_0)\prod_{i=1}^{n}p(x_i|x_{i-1}).$$
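For a discrete-state Markov chain, this factorization can be evaluated directly. A minimal sketch (the initial distribution and transition matrix below are illustrative):

```python
import numpy as np

pi0 = np.array([0.6, 0.4])            # initial distribution p(x_0)
P = np.array([[0.9, 0.1],             # P[i, j] = p(x_{t+1} = j | x_t = i)
              [0.2, 0.8]])

def path_prob(path):
    """Joint probability p(x_0) * prod_i p(x_i | x_{i-1}) of a state path."""
    prob = pi0[path[0]]
    for prev, cur in zip(path, path[1:]):
        prob *= P[prev, cur]
    return prob

print(path_prob([0, 0, 1]))  # 0.6 * 0.9 * 0.1 = 0.054
```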

Definition 17

Definition 18

All Wide-Sense Markov models have a very succinct representation.

Theorem 11

Hidden Markov Processes

Definition 19

Hidden Markov Models can be represented by undirected graphical models. To create an undirected graphical model,

  1. Create a node for each random variable.

  2. Draw an edge between two nodes if a factor of the joint distribution contains both nodes.
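Following the two steps above, here is a small sketch for a hidden Markov process (the helper `hmm_graph` is ours, assuming the standard factorization $p(x_0)\prod_i p(x_i|x_{i-1})\prod_i p(y_i|x_i)$):

```python
def hmm_graph(n):
    """Undirected graphical model for a length-n hidden Markov process with
    hidden states X_0..X_{n-1} and observations Y_0..Y_{n-1}."""
    edges = set()
    for i in range(n - 1):
        edges.add((f"X{i}", f"X{i+1}"))   # from the factor p(x_{i+1} | x_i)
    for i in range(n):
        edges.add((f"X{i}", f"Y{i}"))     # from the factor p(y_i | x_i)
    return edges

print(sorted(hmm_graph(3)))   # a chain of X's, each with a pendant Y
```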

Undirected graphical models of Hidden Markov Processes are useful because they let us derive additional Markov dependencies between groups of variables.

Theorem 12

State-Space Models

Suppose we have a discrete-time random process which evolves in a recursive fashion, meaning the current state depends in some way on the previous state. We can express this recursion with a set of equations.

Definition 20

with initial condition

$$\langle \boldsymbol{X}_i, \boldsymbol{X}_j \rangle = \begin{cases} \Phi_{i,j}\Pi_j & i \geq j \\ \Pi_i \Phi_{j,i}^* & i \leq j \end{cases}$$

$$\langle \boldsymbol{Y}_i, \boldsymbol{Y}_j \rangle = \begin{cases} H_i \Phi_{i,j+1}N_j & i > j \\ R_i + H_i\Pi_iH_i^* & i = j \\ N_i^*\Phi^*_{j,i+1}H_j^* & i < j \end{cases} \quad \text{where } N_i = F_i\Pi_iH_i^* + G_iS_i$$
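As a sketch of how such a model can be simulated, assuming the standard state-space recursion $\boldsymbol{X}_{i+1} = F_i\boldsymbol{X}_i + G_i\boldsymbol{U}_i$ with white process noise (the scalar values below are illustrative, not from the notes), we can check that the sample state variance matches the covariance recursion $\Pi_{i+1} = F_i\Pi_iF_i^* + G_iQ_iG_i^*$:

```python
import numpy as np

# Scalar state-space model (standard form assumed): X_{i+1} = F X_i + G U_i,
# with Var(U_i) = Q and X_0 = 0.
F, G, Q = 0.8, 1.0, 1.0
rng = np.random.default_rng(1)

n_paths, n_steps = 20000, 30
X = np.zeros((n_paths, n_steps + 1))
for i in range(n_steps):
    U = rng.standard_normal(n_paths) * np.sqrt(Q)
    X[:, i + 1] = F * X[:, i] + G * U

# State covariance via the recursion Pi_{i+1} = F Pi_i F + G Q G (scalar case).
Pi = 0.0
for i in range(n_steps):
    Pi = F * Pi * F + G * Q * G

print(Pi, X[:, -1].var())   # recursion vs. empirical variance: close
```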
