Discrete Time Random Processes
Discrete time random processes have a mean function

$$\mu_X[n] = \mathbb{E}\left[X[n]\right]$$

and an auto-correlation function

$$R_X[n_1, n_2] = \mathbb{E}\left[X[n_1]X^*[n_2]\right].$$

When the mean is constant and the auto-correlation depends only on the lag $k = n_1 - n_2$ (so we may write $R_X[k] = \mathbb{E}\left[X[n+k]X^*[n]\right]$), we call the process wide-sense stationary because the mean and covariance do not change as the process evolves. In a strict-sense stationary process, the joint distribution of the random variables in the process would not change under time shifts.
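To make these definitions concrete, here is a small sketch (my own illustration, not from the original notes) that simulates a WSS AR(1) process and compares sample estimates of the mean and autocorrelation against their closed forms; the parameters `a` and `sigma` are arbitrary choices.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate X[n] = a X[n-1] + W[n] with white Gaussian W. After the
# transient dies out, this process is WSS with mean 0 and
# autocorrelation R_X[k] = sigma^2 * a^|k| / (1 - a^2).
a, sigma, N = 0.8, 1.0, 100_000   # arbitrary illustrative parameters
w = sigma * rng.standard_normal(N)
x = np.zeros(N)
for n in range(1, N):
    x[n] = a * x[n - 1] + w[n]
x = x[1000:]  # drop the transient so samples are (approximately) stationary

print("mean estimate:", x.mean())  # should be near 0
for k in range(4):
    r_est = np.mean(x[k:] * x[: len(x) - k])
    r_true = sigma**2 * a**k / (1 - a**2)
    print(f"R_X[{k}]: estimate {r_est:.3f}, theory {r_true:.3f}")
```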
Recall that the Discrete Time Fourier Transform is given by

$$X(\omega) = \sum_{n=-\infty}^{\infty} x[n]e^{-j\omega n}.$$

The Inverse Discrete Time Fourier Transform is given by

$$x[n] = \frac{1}{2\pi}\int_{-\pi}^{\pi} X(\omega)e^{j\omega n}\,d\omega.$$
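Numerically, the DTFT of a finite-support signal can be evaluated directly from the definition. The following sketch (my own, with an arbitrary example signal) also checks the inverse formula by averaging $X(\omega)$ over a uniform frequency grid.

```python
import numpy as np

def dtft(x, n0, omegas):
    """Evaluate X(w) = sum_n x[n] e^{-jwn} for a finite-support signal
    whose first sample sits at time index n0."""
    n = n0 + np.arange(len(x))
    return np.exp(-1j * np.outer(omegas, n)) @ x

# Uniform grid of 512 frequencies on [-pi, pi).
omegas = -np.pi + 2 * np.pi * np.arange(512) / 512
x = np.array([1.0, 2.0, 3.0, 2.0, 1.0])  # example signal supported on -2..2
X = dtft(x, n0=-2, omegas=omegas)

# Inverse DTFT at n = 0: (1/2pi) * integral of X(w) dw, approximated by
# the average of X over the grid. Should recover x[0] = 3.
print(X.mean().real)
```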
Since the DTFT is an infinite summation, it may or may not converge. If $x \in \ell^1$ (i.e., $\sum_{n}|x[n]| < \infty$), the sum converges absolutely; this class covers most real-world signals. The $\ell^2$ class of square-summable signals contains important functions such as $\text{sinc}$, for which the DTFT converges in the mean-square sense. Tempered distributions like the Dirac delta function are other objects which are important for computing the DTFT, and they arise from the theory of generalized functions.
Suppose we want to characterize a deterministic signal $x[n]$ using its DTFT. The autocorrelation of $x$, given by

$$a[k] = \sum_{n=-\infty}^{\infty} x[n+k]x^*[n],$$

is closely related to the energy of the signal since $a[0] = \sum_{n}|x[n]|^2 = E_x$. We call the DTFT of the autocorrelation, $A(\omega) = |X(\omega)|^2$, the energy spectral density because, by the Inverse DTFT,

$$E_x = a[0] = \frac{1}{2\pi}\int_{-\pi}^{\pi} A(\omega)\,d\omega.$$

Since summing over each frequency gives us the energy, we can think of $A(\omega)$ as storing the energy density of each spectral component of the signal. We can apply this same idea to wide-sense stationary stochastic processes.
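A quick numerical check of this relationship (my own sketch, with an arbitrary example signal): the DTFT of the deterministic autocorrelation equals $|X(\omega)|^2$, and its average over frequency recovers the signal energy.

```python
import numpy as np

x = np.array([1.0, -2.0, 0.5, 3.0])  # finite-energy example signal
energy = np.sum(x**2)
N = len(x)

# Deterministic autocorrelation a[k] for lags k = -(N-1)..(N-1).
a = np.correlate(x, x, mode="full")

# Zero-padded FFTs approximate the DTFT on a uniform grid. The phase
# factor re-centers the lag axis, since a's first entry is lag -(N-1).
M = 1024
X = np.fft.fft(x, M)
A = np.fft.fft(a, M) * np.exp(2j * np.pi * np.arange(M) / M * (N - 1))

print(np.allclose(A.real, np.abs(X) ** 2))  # ESD equals |X(w)|^2
print(energy, A.real.mean())                # (1/2pi) ∫ A(w) dw equals energy
```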
Note that when considering stochastic signals, the metric changes from energy to power. This is because if $X[n]$ is Wide-Sense Stationary, then

$$\sum_{n=-\infty}^{\infty}\mathbb{E}\left[|X[n]|^2\right] = \sum_{n=-\infty}^{\infty} R_X[0] = \infty,$$

so energy doesn't even make sense. To build our notion of power, let $X_N(\omega) = \sum_{n=-N}^{N}X[n]e^{-j\omega n}$ be a truncated DTFT of the wide-sense stationary process; then the power spectral density (PSD) is

$$S_X(\omega) = \lim_{N\to\infty}\frac{1}{2N+1}\mathbb{E}\left[|X_N(\omega)|^2\right] = \sum_{k=-\infty}^{\infty}R_X[k]e^{-j\omega k}.$$

The DTFT of the auto-correlation function naturally arises out of taking the energy spectral density and normalizing it by time (the truncated sequence is made of $2N+1$ points). In practice, this means that to measure the PSD, we need to either use the distribution of the signal to compute $R_X[k]$, or estimate the PSD by averaging multiple realizations of the signal.
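The second approach can be sketched in a few lines (my own illustration, with arbitrary AR(1) parameters): average the normalized periodograms of many independent realizations and compare against the closed-form PSD.

```python
import numpy as np

rng = np.random.default_rng(1)
a, sigma, N, trials = 0.9, 1.0, 512, 2000   # arbitrary illustrative choices
omegas = 2 * np.pi * np.fft.rfftfreq(N)

# Average the normalized periodogram |X_N(w)|^2 / N over many
# independent realizations of the AR(1) process X[n] = a X[n-1] + W[n].
psd_est = np.zeros(len(omegas))
for _ in range(trials):
    w = sigma * rng.standard_normal(N + 200)
    x = np.zeros(N + 200)
    for n in range(1, N + 200):
        x[n] = a * x[n - 1] + w[n]
    psd_est += np.abs(np.fft.rfft(x[200:])) ** 2 / N  # drop the transient
psd_est /= trials

# Closed-form PSD of the AR(1) process: sigma^2 / |1 - a e^{-jw}|^2.
psd_true = sigma**2 / np.abs(1 - a * np.exp(-1j * omegas)) ** 2
print(np.max(np.abs(psd_est - psd_true) / psd_true))  # small relative error
```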
The inverse DTFT formula tells us that we can represent a deterministic, discrete-time signal $x[n]$ as a sum of complex exponentials weighted by $X(\omega)$. This representation has an analog for stochastic signals as well. For a complex-valued WSS stochastic process $X[n]$ with power spectral density $S_X(\omega)$, there exists a unique right-continuous stochastic process $Z(\omega)$ with square-integrable, orthogonal increments such that

$$X[n] = \int_{-\pi}^{\pi} e^{j\omega n}\,dZ(\omega),$$

where for any interval $[\omega_1, \omega_2]$,

$$\mathbb{E}\left[|Z(\omega_2) - Z(\omega_1)|^2\right] = \mu_X([\omega_1, \omega_2]),$$

where $\mu_X$ is the structural measure of the stochastic process and has Radon-Nikodym derivative $\frac{S_X(\omega)}{2\pi}$. Besides giving us a decomposition of a WSS random process, Theorem 7 tells us a few important facts.

1. $\mathbb{E}\left[dZ(\omega_1)\,dZ^*(\omega_2)\right] = 0$ for $\omega_1 \neq \omega_2$ (i.e., different frequencies are uncorrelated).
2. $\mathbb{E}\left[|dZ(\omega)|^2\right] = \frac{S_X(\omega)}{2\pi}\,d\omega$.
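These facts connect the spectral representation back to the PSD; a short derivation (standard, not verbatim from the notes) recovers the autocorrelation from the orthogonal increments:

$$R_X[k] = \mathbb{E}\left[\int_{-\pi}^{\pi} e^{j\omega_1(n+k)}\,dZ(\omega_1)\left(\int_{-\pi}^{\pi} e^{j\omega_2 n}\,dZ(\omega_2)\right)^*\right] = \int_{-\pi}^{\pi} e^{j\omega k}\,\mathbb{E}\left[|dZ(\omega)|^2\right] = \frac{1}{2\pi}\int_{-\pi}^{\pi} S_X(\omega)e^{j\omega k}\,d\omega,$$

where the cross terms vanish because increments at different frequencies are orthogonal, leaving exactly the inverse DTFT of the power spectral density.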
Recall that the Z-transform converts a discrete-time signal into a complex representation. It is given by

$$X(z) = \sum_{n=-\infty}^{\infty} x[n]z^{-n}.$$

It is a special type of series called a Laurent series.
We can compute $X(z)$ and its region of convergence directly from the signal $x[n]$; the DTFT, when it exists, equals $X(z)$ evaluated on the unit circle $z = e^{j\omega}$.
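For example (a standard computation added here for concreteness), the one-sided exponential $x[n] = a^nu[n]$ has

$$X(z) = \sum_{n=0}^{\infty}a^nz^{-n} = \frac{1}{1 - az^{-1}},$$

with region of convergence $|z| > |a|$, the set of $z$ where the Laurent series converges. The DTFT of $x$ exists precisely when the unit circle lies inside this region, i.e., when $|a| < 1$.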
In some cases, it can be useful to only compute the Z-transform of the right side of the signal; this unilateral Z-transform is

$$[X(z)]_+ = \sum_{n=0}^{\infty} x[n]z^{-n}.$$

If the Z-transform of the sequence is a rational function, then we can quickly compute what the unilateral Z-transform will be by leveraging its partial fraction decomposition. Using this definition, we can see that

$$X(z) = [X(z)]_+ + [X(z)]_-, \qquad [X(z)]_- = \sum_{n=-\infty}^{-1} x[n]z^{-n}.$$
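As a concrete illustration (my own example), take a two-sided sequence with rational Z-transform

$$X(z) = \frac{1}{(1 - az^{-1})(1 - bz)} = \frac{1}{1 - ab}\left(\frac{az^{-1}}{1 - az^{-1}} + \frac{1}{1 - bz}\right), \qquad |a| < |z| < \frac{1}{|b|}.$$

The first term is the right-sided part ($n \geq 1$), while the second expands into non-negative powers of $z$ ($n \leq 0$). Keeping the samples at $n \geq 0$ (the entire first term plus the $n = 0$ sample of the second) gives

$$[X(z)]_+ = \frac{1}{1 - ab}\left(\frac{az^{-1}}{1 - az^{-1}} + 1\right) = \frac{1}{1 - ab}\cdot\frac{1}{1 - az^{-1}}.$$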
We can also look at the Z-transform of the auto-correlation function of a WSS process $X[n]$ to obtain

$$S_X(z) = \sum_{k=-\infty}^{\infty} R_X[k]z^{-k},$$

which reduces to the power spectral density on the unit circle $z = e^{j\omega}$.
Because the spectral factor $S_X^+(z)$ from Definition 14 is minimum phase and analytic outside the unit circle, it must take the form

$$S_X^+(z) = s_0 + s_1z^{-1} + s_2z^{-2} + \cdots$$

since minimum phase systems are causal. Using Definition 14, we can express $S_X(z)$ as the product of a right-sided and left-sided factor,

$$S_X(z) = S_X^+(z)S_X^-(z).$$

Note that $S_X(z) = S_X^*(z^{-*})$, so the poles and zeros of a rational PSD come in conjugate-reciprocal pairs. Using the assumptions built into Definition 14, we can find a general form for the spectral factors since we know $S_X(z)$ takes the following form:

$$S_X(z) = \sigma^2\,\frac{\prod_i (1 - z_iz^{-1})(1 - z_i^*z)}{\prod_i (1 - p_iz^{-1})(1 - p_i^*z)}.$$

If we let the $(1 - z_iz^{-1})$ and $(1 - p_iz^{-1})$ terms be part of $S_X^+(z)$, then

$$S_X^+(z) = \sigma\,\frac{\prod_i(1 - z_iz^{-1})}{\prod_i(1 - p_iz^{-1})}, \qquad S_X^-(z) = \sigma\,\frac{\prod_i(1 - z_i^*z)}{\prod_i(1 - p_i^*z)}.$$
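For instance (a standard example, not from the notes), a real AR(1) process with $R_X[k] = \frac{\sigma^2a^{|k|}}{1 - a^2}$ and $|a| < 1$ has

$$S_X(z) = \frac{\sigma^2}{(1 - az^{-1})(1 - az)},$$

and matching this to the general form above yields the factors

$$S_X^+(z) = \frac{\sigma}{1 - az^{-1}} \ \text{(right-sided, minimum phase)}, \qquad S_X^-(z) = \frac{\sigma}{1 - az} \ \text{(left-sided)}.$$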
Mathematically, Markov triplets $X - Y - Z$ satisfy three properties.

1. $p(z|x, y) = p(z|y)$
2. $p(x|y, z) = p(x|y)$
3. $p(x, z|y) = p(x|y)p(z|y)$ (i.e., $X$ and $Z$ are conditionally independent given $Y$).

Because of these rules, the joint distribution can be written as $p(x, y, z) = p(x)p(y|x)p(z|y)$.
To simplify notation, we can define $X_i^j = (X_i, X_{i+1}, \ldots, X_j)$ and $X^n = X_0^n$. Because of the conditional independence property, we can write the joint distribution of all states in the Markov process as

$$p(x^n) = p(x_0)\prod_{t=1}^{n} p(x_t|x_{t-1}).$$
The requirement for a triplet to satisfy $p(x, z|y) = p(x|y)p(z|y)$ is a very strict requirement. If we wanted to create a "wider" requirement of Markovity, then we could settle for

$$\hat{\mathbb{E}}\left[X|Y, Z\right] = \hat{\mathbb{E}}\left[X|Y\right],$$

where $\hat{\mathbb{E}}[X|Y]$ is the best linear estimator of $X$ given $Y$, since this property is satisfied by all Markov triplets, but does not imply a Markov triplet.
All Wide-Sense Markov models have a very succinct representation: they can be written as a linear recursion $X_{i+1} = F_iX_i + U_i$ driven by noise $U_i$ that is uncorrelated with $X_i$.
We can think of a hidden Markov process $Y_i$ as a noisy observation of an underlying Markov process $X_i$. The joint distribution of $X^n$ and $Y^n$ can be written as

$$p(x^n, y^n) = p(x_0)\prod_{t=1}^{n}p(x_t|x_{t-1})\prod_{t=0}^{n}p(y_t|x_t).$$
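A minimal sampling sketch (my own, with an arbitrary two-state chain and Gaussian emissions) makes this factorization operational: each hidden state is drawn from $p(x_t|x_{t-1})$ and each observation from $p(y_t|x_t)$.

```python
import numpy as np

rng = np.random.default_rng(2)

# Hidden Markov chain: P[i, j] = p(X_{t+1} = j | X_t = i).
P = np.array([[0.95, 0.05],
              [0.10, 0.90]])
mu = np.array([0.0, 3.0])  # emissions: Y_t | X_t = i ~ N(mu[i], 1)

T = 200
x = np.zeros(T, dtype=int)
y = np.zeros(T)
x[0] = rng.integers(2)                       # X_0 from a uniform prior
y[0] = mu[x[0]] + rng.standard_normal()
for t in range(1, T):
    x[t] = rng.choice(2, p=P[x[t - 1]])      # draw from p(x_t | x_{t-1})
    y[t] = mu[x[t]] + rng.standard_normal()  # draw from p(y_t | x_t)
```

Each iteration multiplies one $p(x_t|x_{t-1})$ factor and one $p(y_t|x_t)$ factor into the joint, so the samples follow exactly the factored distribution above.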
Hidden Markov Models can be represented by undirected graphical models. To create an undirected graphical model,

1. Create a node for each random variable.
2. Draw an edge between two nodes if a factor of the joint distribution contains both nodes.

Undirected graphical models of Hidden Markov Processes are useful because they let us derive additional Markov dependencies between groups of variables.
Suppose we have a discrete-time random process which evolves in a recursive fashion, meaning the current state depends in some way on the previous state. We can express this recursion with a set of equations:

$$\boldsymbol{X}_{i+1} = F_i\boldsymbol{X}_i + G_i\boldsymbol{U}_i, \qquad \boldsymbol{Y}_i = H_i\boldsymbol{X}_i + \boldsymbol{V}_i,$$

where the noises satisfy

$$\langle \boldsymbol{U}_i, \boldsymbol{U}_j \rangle = Q_i\delta_{ij}, \quad \langle \boldsymbol{V}_i, \boldsymbol{V}_j \rangle = R_i\delta_{ij}, \quad \langle \boldsymbol{U}_i, \boldsymbol{V}_j \rangle = S_i\delta_{ij},$$

and the initial state satisfies $\langle \boldsymbol{X}_0, \boldsymbol{X}_0 \rangle = \Pi_0$ and is uncorrelated with the noises. From Theorem 11, we can easily see that state space models are Wide-Sense Markov. Note that $\boldsymbol{U}_i$ and $\boldsymbol{V}_i$ are white noise, and that the dynamics of the system can change at every time step. From these equations, we can derive six different properties (a simulation check of one appears after the list). Let $\Pi_i = \langle \boldsymbol{X}_i, \boldsymbol{X}_i \rangle$ and $\Phi_{i,j} = F_{i-1}F_{i-2}\cdots F_j$ (with $\Phi_{i,i} = I$) and $N_i = F_i\Pi_iH_i^* + G_iS_i$.
1. $$\langle \boldsymbol{X}_i, \boldsymbol{U}_j \rangle = \begin{cases} \Phi_{i,j+1}G_jQ_j & i > j \\ 0 & i \leq j \end{cases}$$
2. $$\langle \boldsymbol{X}_i, \boldsymbol{V}_j \rangle = \begin{cases} \Phi_{i,j+1}G_jS_j & i > j \\ 0 & i \leq j \end{cases}$$
3. $$\Pi_{i+1} = F_i\Pi_iF_i^* + G_iQ_iG_i^*$$
4. $$\langle \boldsymbol{X}_i, \boldsymbol{Y}_j \rangle = \begin{cases} \Phi_{i,j+1}N_j & i > j \\ \Pi_iH_i^* & i = j \\ \Pi_i\Phi_{j,i}^*H_j^* & i < j \end{cases}$$
5. $$\langle \boldsymbol{X}_i, \boldsymbol{X}_j \rangle = \begin{cases} \Phi_{i,j}\Pi_j & i \geq j \\ \Pi_i \Phi_{j,i}^* & i \leq j \end{cases}$$
6. $$\langle \boldsymbol{Y}_i, \boldsymbol{Y}_j \rangle = \begin{cases} H_i \Phi_{i,j+1}N_j & i > j\\ R_i + H_i\Pi_iH_i^* & i=j \\ N_i^*\Phi^*_{j,i+1}H_j^* & i < j \end{cases}$$
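To sanity-check these properties, here is a small Monte Carlo sketch (my own, with arbitrary time-invariant scalar dynamics) that simulates the state equation and verifies property 5, $\langle \boldsymbol{X}_i, \boldsymbol{X}_j \rangle = \Phi_{i,j}\Pi_j$ for $i \geq j$, where $\Phi_{i,j} = F^{i-j}$ in the time-invariant case.

```python
import numpy as np

rng = np.random.default_rng(3)

# Time-invariant scalar model: X_{i+1} = F X_i + G U_i with
# <U_i, U_j> = Q delta_ij and <X_0, X_0> = Pi_0 (arbitrary values).
F, G, Q, Pi0 = 0.7, 1.0, 1.0, 2.0
steps, trials = 10, 200_000

X = np.sqrt(Pi0) * rng.standard_normal(trials)
states = [X]
for _ in range(steps):
    X = F * X + G * np.sqrt(Q) * rng.standard_normal(trials)
    states.append(X)

i, j = 8, 5
Pi_j = np.mean(states[j] ** 2)          # Monte Carlo estimate of Pi_j
cross = np.mean(states[i] * states[j])  # Monte Carlo <X_i, X_j>
print(cross, F ** (i - j) * Pi_j)       # should agree (property 5)
```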