

Random Variables and Signals

Topic 19: Stochastic Processes



Stochastic Processes

We have already seen discrete-time random processes, but we will now formalize the concept of a random process, including both discrete-time and continuous-time processes.

Definition $ \qquad $ A stochastic process, or random process, defined on (S,F,P) is a family of random variables {X(t), t ∈ T} indexed by a set T.


Fig 1: The mapping from the sample space to the reals under X$ _j $.


Each waveform is referred to as a sample realization. Note that T can be uncountable, as shown above, or countable.

Note that

  • X(t,$ \omega $) (or simply X(t)) is a random process.
  • X(t$ _0 $,$ \omega $) is a random variable for fixed t$ _0 $.
  • X(t,$ \omega_0 $) is a real-valued function of t for fixed $ \omega_0 $.
  • X(t$ _0 $,$ \omega_0 $) is a real number for fixed t$ _0 $ and $ \omega_0 $.

There are four types of random processes we will consider:

  1. T ⊂ R uncountable, X(t) a discrete random variable for every t ∈ T is a continuous-time discrete random process.
  2. T ⊂ R uncountable, X(t) a continuous random variable for every t ∈ T is a continuous-time continuous random process.
  3. T ⊂ R countable, X(t) a discrete random variable for every t ∈ T is a discrete-time discrete random process.
  4. T ⊂ R countable, X(t) a continuous random variable for every t ∈ T is a discrete-time continuous random process.

Example $ \qquad $ If T = N = {1,2,3,...}, then X(t) is a discrete-time random process, usually written as X$ _1 $,X$ _2 $,...

Example $ \qquad $ a binary waveform with random transition times


Fig 2: A binary waveform with random transition times.
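
A minimal simulation sketch of such a waveform, assuming (purely for illustration, since no transition model is specified above) i.i.d. exponential inter-transition times:

  import numpy as np

  rng = np.random.default_rng(0)

  def binary_waveform(t_grid, rate=1.0):
      # One realization of a +/-1 waveform that flips at random times.
      # Inter-transition times are i.i.d. Exponential(rate), an
      # illustrative assumption rather than a model fixed by these notes.
      gaps = rng.exponential(1.0 / rate, size=int(3 * rate * t_grid[-1]) + 10)
      flips = np.cumsum(gaps)                   # random transition times
      n_flips = np.searchsorted(flips, t_grid)  # flips occurring up to each t
      start = rng.choice([-1.0, 1.0])           # random initial level
      return start * (-1.0) ** n_flips

  t = np.linspace(0.0, 10.0, 1001)
  x = binary_waveform(t)   # one sample realization X(t, omega_0)

Re-running with a different seed draws another waveform from the same ensemble.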


Example $ \qquad $ A sinusoid with random frequency

$ X(t)=\sin(\Omega t) $

where $ \Omega $ is a random variable.
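
A short sketch connecting this example to the four views of X(t,$ \omega $) listed earlier; the Uniform(0.5, 2.0) distribution on $ \Omega $ is an assumption made only for illustration:

  import numpy as np

  rng = np.random.default_rng(1)

  t = np.linspace(0.0, 10.0, 1001)
  # Assume Omega ~ Uniform(0.5, 2.0); the example above leaves the
  # distribution of Omega unspecified.
  omegas = rng.uniform(0.5, 2.0, size=5)   # five outcomes omega_0
  paths = np.sin(np.outer(omegas, t))      # each row: X(t, omega_0), a function of t
  x_t0 = np.sin(omegas * t[100])           # fixed t_0: one random variable, five samples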



Probabilistic Description of a Random Process

We can use joint pdfs or pmfs, but often we use the first- and second-order moments instead.

Definition $ \qquad $ The nth order cdf of X(t) is

$ F_{X(t_1)...X(t_n)}(x_1,...,x_n)\equiv P(X(t_1)\leq x_1,...,X(t_n)\leq x_n) $

and the nth order pdf is

$ f_{X(t_1)...X(t_n)}(x_1,...,x_n)=\frac{\partial F_{X(t_1)...X(t_n)}(x_1,...,x_n)}{\partial x_1...\partial x_n} $

Notation $ \qquad $ For n = 1, we have

$ f_{X(t_1)}(x_1)=f_{X_1}(x_1) $

and for n = 2,

$ f_{X(t_1)X(t_2)}(x_1, x_2)=f_{X_1X_2}(x_1,x_2) $

Definition $ \qquad $ The nth order pmf of a discrete random process is

$ p_{X(t_1)...X(t_n)}(x_1,...,x_n)=P(X(t_1)=x_1,...,X(t_n)=x_n) $

It can be shown that if f$ _{X(t_1)...X(t_n)} $(x$ _1 $,...,x$ _n $) is specified ∀t$ _1 $,...,t$ _n $; ∀n = 1,2,..., then X(t) is a valid random process consistent with a probability space (S,F,P). This result comes from the Kolmogorov existence theorem, which we will not cover.

Now consider the first and second order moments for a random process.

Definition $ \qquad $ The mean of a random process X(t) is

$ \mu_X(t)\equiv E[X(t)]\quad\forall t\in T $

Definition $ \qquad $ The autocorrelation function of a random process X(t) is

$ R_{XX}(t_1,t_2)\equiv E[X(t_1)X(t_2)] $

Note: R$ _{XX} $(t$ _1 $,t$ _2 $) = R$ _{XX} $(t$ _2 $,t$ _1 $)

Definition $ \qquad $ The autocovariance function of a random process X(t) is

$ \begin{align} C_{XX}(t_1,t_2)&\equiv E[(X(t_1)-\mu_X(t_1))(X(t_2)-\mu_X(t_2))] \\ &=R_{XX}(t_1,t_2)-\mu_X(t_1)\mu_X(t_2) \end{align} $
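
All three moments are easy to estimate from an ensemble of realizations. A sketch, reusing the random-frequency sinusoid from the earlier example (the uniform distribution on $ \Omega $ is still only an illustrative choice):

  import numpy as np

  rng = np.random.default_rng(2)

  # Ensemble of realizations on a discrete time grid, one per row.
  t = np.linspace(0.0, 10.0, 201)
  X = np.sin(np.outer(rng.uniform(0.5, 2.0, size=5000), t))

  mu = X.mean(axis=0)            # estimate of mu_X(t)
  R = (X.T @ X) / X.shape[0]     # estimate of R_XX(t_i, t_j)
  C = R - np.outer(mu, mu)       # C_XX = R_XX - mu_X(t_1) mu_X(t_2)

  assert np.allclose(R, R.T)     # R_XX(t_1, t_2) = R_XX(t_2, t_1)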


Important property of R$ _{XX} $ and C$ _{XX} $:
R$ _{XX} $ and C$ _{XX} $ are non-negative definite functions, i.e., ∀a$ _1 $,...,a$ _n $ ∈ R and t$ _1 $,...,t$ _n $ ∈ R, and ∀n ∈ N,

$ \sum_{i=1}^n\sum_{j=1}^na_ia_jR_{XX}(t_i,t_j)\geq 0 $

Proof $ \qquad $ See the proof of the NND property of the correlation matrix R$ _X $, letting R$ _{ij} $ = R$ _{XX} $(t$ _i $,t$ _j $).
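
A quick numerical check of the non-negative definiteness, using an empirically estimated correlation matrix (same illustrative model as above):

  import numpy as np

  rng = np.random.default_rng(3)

  # Empirical R_ij = R_XX(t_i, t_j) for the random-frequency sinusoid.
  t = np.linspace(0.0, 10.0, 50)
  X = np.sin(np.outer(rng.uniform(0.5, 2.0, size=2000), t))
  R = (X.T @ X) / X.shape[0]

  # The double sum over a_i a_j R_XX(t_i, t_j) is >= 0 for any vector a...
  a = rng.standard_normal(len(t))
  assert a @ R @ a >= -1e-9      # non-negative up to floating-point error

  # ...equivalently, the symmetric matrix R has no negative eigenvalues.
  assert np.linalg.eigvalsh(R).min() >= -1e-9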


Two important classes of random processes:

Definition $ \qquad $ A random process W(t) is called a white noise process if C$ _{WW} $(t$ _1 $,t$ _2 $) = 0 ∀t$ _1 $ ≠ t$ _2 $.

This means that ∀t$ _1 $ ≠ t$ _2 $, W(t$ _1 $) and W(t$ _2 $) are uncorrelated.
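
A minimal discrete-time sketch: an i.i.d. sequence (standard normal here, an illustrative choice) gives an empirical C$ _{WW} $(t$ _1 $,t$ _2 $) that is approximately zero off the diagonal:

  import numpy as np

  rng = np.random.default_rng(4)

  W = rng.standard_normal((100_000, 5))   # columns play the role of t_1,...,t_5
  C = np.cov(W, rowvar=False)             # empirical C_WW(t_i, t_j)

  # Off-diagonal entries are near 0: W(t_1), W(t_2) uncorrelated for t_1 != t_2.
  assert np.abs(C - np.diag(np.diag(C))).max() < 0.05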

Definition $ \qquad $ A random process X(t) is called a Gaussian random process if X(t$ _1 $),...,X(t$ _n $) are jointly Gaussian random variables ∀t$ _1 $,...,t$ _n $ for any n ∈ N.


The nth order characteristic function of a Gaussian random process is given by

$ \Phi_{X(t_1)...X(t_n)}(\omega_1,...,\omega_n) = e^{ i\sum_{k=1}^n \mu_X(t_k)\omega_k - \frac{1}{2} \sum_{j=1}^n \sum_{k=1}^n C_{XX}(t_j,t_k)\omega_j\omega_k} $
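
Because the finite-dimensional distributions are jointly Gaussian, sample paths on any finite time grid can be drawn from $ \mu_X $ and C$ _{XX} $ alone. A sketch with a zero mean and a squared-exponential covariance, both chosen only for illustration:

  import numpy as np

  rng = np.random.default_rng(5)

  t = np.linspace(0.0, 10.0, 200)
  mu = np.zeros_like(t)                                # mu_X(t_k), assumed zero
  C = np.exp(-0.5 * (t[:, None] - t[None, :]) ** 2)    # assumed C_XX(t_j, t_k)
  C += 1e-9 * np.eye(len(t))                           # jitter for numerical stability

  paths = rng.multivariate_normal(mu, C, size=3)       # three sample realizations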



Stationarity

Intuitive idea: A random process is stationary (in some sense) if its probabilistic description (nth order cdf/pdf/pmf, or mean, autocorrelation, and autocovariance functions) does not depend on the time origin.


Fig 3

Does the nth order cdf/pdf/pmf depend on where t = 0 is? Do $ \mu_X $(t), R$ _{XX} $(t$ _1 $,t$ _2 $), and C$ _{XX} $(t$ _1 $,t$ _2 $)?

Definition $ \qquad $ A random process X(t) is strict-sense stationary (SSS), or simply stationary, if

$ F_{X(t_1)...X(t_n)}(x_1,...,x_n)=F_{X(t_1+\alpha)...X(t_n+\alpha)}(x_1,...,x_n) $
$ \forall\alpha\in\mathbb R,\;n\in\mathbb N,\;t_1,...,t_n\in\mathbb R $
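
A quick numerical illustration of the definition for n = 1, using an i.i.d. Gaussian sequence (a standard example of an SSS process, chosen here only for illustration): the empirical first-order cdf looks the same at any two times.

  import numpy as np

  rng = np.random.default_rng(6)

  X = rng.standard_normal((50_000, 20))   # rows: realizations; columns: times
  x_grid = np.linspace(-3.0, 3.0, 13)

  # Empirical F_{X(t_1)}(x) at t_1 and at t_1 + alpha (here alpha = 7).
  F = (X[:, 2][:, None] <= x_grid).mean(axis=0)
  F_shift = (X[:, 9][:, None] <= x_grid).mean(axis=0)
  assert np.abs(F - F_shift).max() < 0.02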
