
3.1 Definitions

Stochastic process

The idea of a stochastic process is a straightforward extension of that of a random variable. Instead of mapping each outcome $ \omega\in\mathcal{S} $ of a random experiment to a number $ \mathbf{X}\left(\omega\right) $, we map it to a function of time $ \mathbf{X}\left(t,\omega\right) $, called a sample function.

Note

There is nothing random about the sample functions. The randomness comes from the underlying random experiment.

Note

If we pick a particular point in time $ t=t_{1} $, then $ \mathbf{X}\left(t_{1},\omega\right) $ is a random variable.

Definition

A stochastic process or random process defined on $ \left(\mathcal{S},\mathcal{F},\mathcal{P}\right) $ is a family of random variables

$ \left\{ \mathbf{X}\left(t\right):t\in\mathbf{T}\right\} $

indexed by $ t $ , where the index set $ \mathbf{T} $ can be discrete or continuous.

Note

1. If $ \mathbf{T} $ is an uncountable subset of $ \mathbf{R} $ , $ \mathbf{X}\left(t\right) $ is called a continuous-time random process.

2. If $ \mathbf{T} $ is a discrete set, $ \mathbf{X}\left(t\right) $ is called a discrete-time random process.

3. $ \mathbf{X}\left(t\right) $ is called a discrete-state random process if for all $ t\in\mathbf{T} $ , it takes on values from a discrete set. Otherwise it is called a continuous-state random process.

Note

Thermal noise is a continuous-time, continuous-state random process.

Notation

We will use the notation $ \mathbf{X}\left(t\right) $ to represent a random process (just as we use $ \mathbf{X} $ to represent a random variable). Technically we should write $ \mathbf{X}\left(t,\omega\right) $ or $ \mathbf{X}\left(\cdot,\cdot\right) $ (just as technically we should write $ \mathbf{X}\left(\omega\right) $ or $ \mathbf{X}\left(\cdot\right) $ for a random variable).

$ \mathbf{X}\left(\cdot,\cdot\right):\mathbf{T}\times\mathcal{S}\longrightarrow\mathbf{R}. $

Note

$ \mathbf{X}\left(t,\omega\right) $ or $ \mathbf{X}\left(\cdot,\cdot\right) $ is a random process.

$ \mathbf{X}\left(t_{0},\omega\right) $ or $ \mathbf{X}\left(t_{0},\cdot\right) $ is a random variable for a fixed $ t_{0}\in\mathbf{T} $.

$ \mathbf{X}\left(t,\omega_{0}\right) $ or $ \mathbf{X}\left(\cdot,\omega_{0}\right) $ for a fixed $ \omega_{0}\in\mathcal{S} $ is a function of time, called a sample function.

$ \mathbf{X}\left(t_{0},\omega_{0}\right) $ is a real number.
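As a concrete illustration of these four views, here is a minimal Python sketch, assuming a hypothetical process $ \mathbf{X}\left(t,\omega\right)=\mathbf{A}\left(\omega\right)\cos\left(t\right) $ in which the outcome $ \omega $ determines only the amplitude:

```python
import numpy as np

def X(t, omega):
    # The outcome omega (modeled here as an integer seed) fixes the amplitude
    # A(omega); all randomness lives in the underlying experiment, and the
    # resulting sample function of t is deterministic.
    A = np.random.default_rng(omega).standard_normal()
    return A * np.cos(t)

t = np.linspace(0, 2 * np.pi, 100)

x_path = X(t, omega=42)                  # fixed omega_0: a sample function of t
x_rv = [X(1.0, w) for w in range(1000)]  # fixed t_0 = 1.0: draws of a random variable
x_num = X(1.0, 42)                       # both fixed: a single real number
```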

3.1.1 Statistics of Stochastic Processes

First-order CDF and PDF

We use CDFs and PDFs to characterize the probabilistic behavior of a random process. A random process sampled at a fixed point in time is a random variable, so we can consider its CDF or PDF.

Definition

The first-order CDF of a random process $ \mathbf{X}\left(t\right) $ is

$ F_{\mathbf{X}\left(t\right)}\left(x\right)\triangleq P\left(\left\{ \mathbf{X}\left(t\right)\leq x\right\} \right)=F_{\mathbf{X}\left(t\right)}\left(x,t\right) $.

The first-order PDF of a random process $ \mathbf{X}\left(t\right) $ is

$ f_{\mathbf{X}\left(t\right)}\left(x\right)=\frac{dF_{\mathbf{X}\left(t\right)}\left(x\right)}{dx}=f_{\mathbf{X}\left(t\right)}\left(x,t\right). $
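As a sketch of how a first-order CDF can be estimated empirically, the snippet below samples the hypothetical process $ \mathbf{X}\left(t\right)=\mathbf{A}\cos\left(t\right) $ with $ \mathbf{A}\sim N\left(0,1\right) $ at a fixed time $ t_{0} $; the dependence of the CDF on the sampling time enters through the factor $ \cos\left(t_{0}\right) $:

```python
import numpy as np
from math import erf, sqrt

rng = np.random.default_rng(0)

# X(t0) = A cos(t0) is Gaussian with mean 0 and standard deviation |cos(t0)|,
# so the first-order CDF genuinely depends on t0.
t0 = 0.3
samples = rng.standard_normal(100_000) * np.cos(t0)  # draws of X(t0)

x = 0.5
F_hat = np.mean(samples <= x)                               # empirical CDF at x
F_exact = 0.5 * (1 + erf(x / (abs(np.cos(t0)) * sqrt(2))))  # exact Gaussian CDF
print(F_hat, F_exact)
```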

The n-th order CDF and PDF

We are often interested in the joint behavior of $ \mathbf{X}\left(t\right) $ sampled at more than one point in time, for which a first-order description is not sufficient.

Definition

The n-th order CDF of a random process $ \mathbf{X}\left(t\right) $ is

$ F_{\mathbf{X}\left(t_{1}\right),\cdots,\mathbf{X}\left(t_{n}\right)}\left(x_{1},\cdots,x_{n}\right)\triangleq P\left(\left\{ \mathbf{X}\left(t_{1}\right)\leq x_{1},\cdots,\mathbf{X}\left(t_{n}\right)\leq x_{n}\right\} \right) $

and the n-th order PDF is

$ f_{\mathbf{X}\left(t_{1}\right),\cdots,\mathbf{X}\left(t_{n}\right)}\left(x_{1},\cdots,x_{n}\right)=\frac{\partial^{n}F_{\mathbf{X}\left(t_{1}\right),\cdots,\mathbf{X}\left(t_{n}\right)}\left(x_{1},\cdots,x_{n}\right)}{\partial x_{1}\cdots\partial x_{n}}. $

Of particular interest are the second-order $ \left(n=2\right) $ CDFs and PDFs, $ F_{\mathbf{X}\left(t_{1}\right),\mathbf{X}\left(t_{2}\right)}\left(x_{1},x_{2};t_{1},t_{2}\right) $ and $ f_{\mathbf{X}\left(t_{1}\right),\mathbf{X}\left(t_{2}\right)}\left(x_{1},x_{2};t_{1},t_{2}\right) $ .
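The same Monte Carlo idea extends to the second-order CDF. A sketch, again assuming the hypothetical process $ \mathbf{X}\left(t\right)=\mathbf{A}\cos\left(t\right) $ with $ \mathbf{A}\sim N\left(0,1\right) $:

```python
import numpy as np

rng = np.random.default_rng(0)

# Second-order (n = 2) empirical CDF: estimate P(X(t1) <= x1, X(t2) <= x2)
# by sampling the process at two times within each realization.
A = rng.standard_normal(100_000)
t1, t2 = 0.2, 1.1
x1, x2 = 0.3, 0.5

F2_hat = np.mean((A * np.cos(t1) <= x1) & (A * np.cos(t2) <= x2))
print(F2_hat)
```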

3.1.2 General Properties

Fact

A complex random process $ \mathbf{Z}\left(t\right)=\mathbf{X}\left(t\right)+i\mathbf{Y}\left(t\right) $ where $ \mathbf{X}\left(t\right) $ and $ \mathbf{Y}\left(t\right) $ are real jointly-distributed random processes, is completely characterized by

$ F_{\mathbf{X}\left(t_{1}\right),\cdots,\mathbf{X}\left(t_{n}\right),\mathbf{Y}\left(t_{1}\right),\cdots,\mathbf{Y}\left(t_{n}\right)}\left(x_{1},\cdots,x_{n},y_{1},\cdots,y_{n};t_{1},\cdots,t_{n}\right) $

for all $ n\in\mathbf{N} $ and all $ t_{1},\cdots,t_{n} $. (This is a very hard result to prove.)

Fact. The real version

A real random process $ \mathbf{X}\left(t\right) $ is completely characterized by the joint CDF

$ F_{\mathbf{X}\left(t_{1}\right),\cdots,\mathbf{X}\left(t_{n}\right)}\left(x_{1},\cdots,x_{n};t_{1},\cdots,t_{n}\right) $

for all $ n\in\mathbf{N} $ and all $ t_{1},\cdots,t_{n} $ .

Definition. Autocorrelation function

For a random process $ \mathbf{X}\left(t\right) $ , real or complex, the autocorrelation function of $ \mathbf{X}\left(t\right) $ is

$ R_{\mathbf{XX}}\left(t_{1},t_{2}\right)\triangleq E\left[\mathbf{X}\left(t_{1}\right)\mathbf{X}^{*}\left(t_{2}\right)\right]. $

Note

$ R_{\mathbf{XX}}\left(t_{1},t_{2}\right) $

Note

For $ \mathbf{X}\left(t\right) $ real

$ R_{\mathbf{XX}}\left(t_{1},t_{2}\right)=E\left[\mathbf{X}\left(t_{1}\right)\mathbf{X}\left(t_{2}\right)\right] $

and

$ R_{\mathbf{XX}}\left(t_{1},t_{2}\right)=R_{\mathbf{XX}}\left(t_{2},t_{1}\right). $

Note

$ E\left[\left|\mathbf{X}\left(t\right)\right|^{2}\right]=E\left[\mathbf{X}\left(t\right)\mathbf{X}^{*}\left(t\right)\right]=R_{\mathbf{XX}}\left(t,t\right). $

$ E\left[\left|\mathbf{X}\left(t\right)\right|^{2}\right]\geq0\Longrightarrow R_{\mathbf{XX}}\left(t,t\right)\geq0,\quad\forall t. $
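A minimal sketch of estimating $ R_{\mathbf{XX}}\left(t_{1},t_{2}\right) $ by averaging over realizations, assuming the same hypothetical process $ \mathbf{X}\left(t\right)=\mathbf{A}\cos\left(t\right) $ with $ \mathbf{A}\sim N\left(0,1\right) $, for which $ R_{\mathbf{XX}}\left(t_{1},t_{2}\right)=E\left[\mathbf{A}^{2}\right]\cos\left(t_{1}\right)\cos\left(t_{2}\right)=\cos\left(t_{1}\right)\cos\left(t_{2}\right) $:

```python
import numpy as np

rng = np.random.default_rng(0)

# Monte Carlo estimate of R_XX(t1, t2) = E[X(t1) X(t2)] for the real process
# X(t) = A cos(t), averaged over many realizations of A.
A = rng.standard_normal(200_000)
t1, t2 = 0.4, 1.3

R_hat = np.mean((A * np.cos(t1)) * (A * np.cos(t2)))
print(R_hat, np.cos(t1) * np.cos(t2))  # estimate vs. exact value
```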

Definition. Mean

The mean of a random process $ \mathbf{X}\left(t\right) $ is

$ \eta_{\mathbf{X}}\left(t\right)=E\left[\mathbf{X}\left(t\right)\right]. $

If $ \mathbf{X}\left(t\right) $ is a complex random process with $ \mathbf{X}\left(t\right)=\mathbf{X}_{R}\left(t\right)+i\mathbf{X}_{I}\left(t\right) $, then $ E\left[\mathbf{X}\left(t\right)\right]=E\left[\mathbf{X}_{R}\left(t\right)+i\mathbf{X}_{I}\left(t\right)\right]=E\left[\mathbf{X}_{R}\left(t\right)\right]+iE\left[\mathbf{X}_{I}\left(t\right)\right]. $

Definition. Autocovariance function

The autocovariance function of a random process $ \mathbf{X}\left(t\right) $ with mean $ \eta_{\mathbf{X}}\left(t\right)=E\left[\mathbf{X}\left(t\right)\right] $ is

$ C_{\mathbf{XX}}\left(t_{1},t_{2}\right)\triangleq E\left[\left(\mathbf{X}\left(t_{1}\right)-\eta_{\mathbf{X}}\left(t_{1}\right)\right)\left(\mathbf{X}\left(t_{2}\right)-\eta_{\mathbf{X}}\left(t_{2}\right)\right)^{*}\right]=R_{\mathbf{XX}}\left(t_{1},t_{2}\right)-\eta_{\mathbf{X}}\left(t_{1}\right)\eta_{\mathbf{X}}^{*}\left(t_{2}\right). $

Fact

The autocorrelation function $ R_{\mathbf{XX}}\left(t_{1},t_{2}\right) $ is a non-negative definite function: for any set of complex numbers $ \left\{ a_{i}\right\} $ and times $ \left\{ t_{i}\right\} $, and any $ n\in\mathbf{N} $,

$ \sum_{i=1}^{n}\sum_{j=1}^{n}a_{i}a_{j}^{*}R_{\mathbf{XX}}\left(t_{i},t_{j}\right)\geq0. $
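This follows from the non-negativity of a second moment: with $ \mathbf{Y}=\sum_{i=1}^{n}a_{i}\mathbf{X}\left(t_{i}\right) $,

$ 0\leq E\left[\left|\mathbf{Y}\right|^{2}\right]=E\left[\sum_{i=1}^{n}\sum_{j=1}^{n}a_{i}\mathbf{X}\left(t_{i}\right)a_{j}^{*}\mathbf{X}^{*}\left(t_{j}\right)\right]=\sum_{i=1}^{n}\sum_{j=1}^{n}a_{i}a_{j}^{*}R_{\mathbf{XX}}\left(t_{i},t_{j}\right). $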

Definition. White noise process

A random process $ \mathbf{W}\left(t\right) $ is a white noise process if $ C_{\mathbf{WW}}\left(t_{1},t_{2}\right)=0,\quad\forall t_{1}\neq t_{2} $.

Note

We will see that all non-trivial white noise processes have $ C_{\mathbf{WW}}\left(t_{1},t_{2}\right)=q\left(t_{1}\right)\cdot\delta\left(t_{1}-t_{2}\right) $ .
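A discrete-time analogue can be checked numerically. The sketch below uses i.i.d. Gaussian samples (an assumption standing in for continuous-time white noise) and verifies that the sample autocovariance matrix is nearly diagonal:

```python
import numpy as np

rng = np.random.default_rng(0)

# Discrete-time stand-in for white noise: W[k] i.i.d. N(0, 1), so that
# C_WW(k1, k2) = 0 for k1 != k2 and C_WW(k, k) = 1.
n_realizations, n_times = 100_000, 5
W = rng.standard_normal((n_realizations, n_times))

# Sample autocovariance across realizations: rows are realizations, columns
# are time samples. Off-diagonal entries should be near 0, diagonal near 1.
C_hat = np.cov(W, rowvar=False)
print(np.round(C_hat, 3))
```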

Definition. Gaussian random process

A random process $ \mathbf{X}\left(t\right) $ is called a Gaussian random process if the random variables $ \mathbf{X}\left(t_{1}\right),\mathbf{X}\left(t_{2}\right),\cdots,\mathbf{X}\left(t_{n}\right) $ are jointly Gaussian for any $ n\in\mathbf{N} $ and any set of sampling times $ t_{1},t_{2},\cdots,t_{n} $ .

Note

The n-th order characteristic function of a Gaussian random process is $ \Phi_{\mathbf{X}\left(t_{1}\right)\cdots\mathbf{X}\left(t_{n}\right)}\left(\omega_{1},\cdots,\omega_{n}\right)=\exp\left\{ i\sum_{k=1}^{n}\eta_{\mathbf{X}}\left(t_{k}\right)\omega_{k}-\frac{1}{2}\sum_{j=1}^{n}\sum_{k=1}^{n}C_{\mathbf{XX}}\left(t_{j},t_{k}\right)\omega_{j}\omega_{k}\right\} $ .

Important Fact

A Gaussian random process is completely characterized by its mean $ \eta_{\mathbf{X}}\left(t\right)=E\left[\mathbf{X}\left(t\right)\right] $ and its autocovariance function $ C_{\mathbf{XX}}\left(t_{1},t_{2}\right)=E\left[\left(\mathbf{X}\left(t_{1}\right)-\eta_{\mathbf{X}}\left(t_{1}\right)\right)\left(\mathbf{X}\left(t_{2}\right)-\eta_{\mathbf{X}}\left(t_{2}\right)\right)^{*}\right] $, as the characteristic function above shows.
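This fact is what makes Gaussian processes convenient to simulate: any finite set of samples $ \mathbf{X}\left(t_{1}\right),\cdots,\mathbf{X}\left(t_{n}\right) $ can be drawn from the multivariate normal implied by $ \eta_{\mathbf{X}} $ and $ C_{\mathbf{XX}} $. A sketch, assuming a hypothetical zero mean and squared-exponential covariance $ C_{\mathbf{XX}}\left(t_{1},t_{2}\right)=e^{-\left(t_{1}-t_{2}\right)^{2}/2} $:

```python
import numpy as np

rng = np.random.default_rng(0)

# Evaluate the assumed mean and covariance on a grid of sampling times,
# then draw X(t_1), ..., X(t_n) from the implied multivariate normal.
t = np.linspace(0, 10, 50)
eta = np.zeros_like(t)
C = np.exp(-0.5 * (t[:, None] - t[None, :]) ** 2)
C += 1e-9 * np.eye(len(t))  # tiny jitter for numerical positive-definiteness

x = rng.multivariate_normal(eta, C)  # one sample path; each call gives another
```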

3.1.3 Stationary Processes

Definition. SSS

A random process $ \mathbf{X}\left(t\right) $ is called stationary or strict-sense stationary (SSS) if its probabilistic description is invariant to shifts in the time origin:

$ F_{\mathbf{X}\left(t_{1}\right),\cdots,\mathbf{X}\left(t_{n}\right)}\left(x_{1},\cdots,x_{n}\right)=F_{\mathbf{X}\left(t_{1}+c\right),\cdots,\mathbf{X}\left(t_{n}+c\right)}\left(x_{1},\cdots,x_{n}\right) $

for all $ c\in\mathbf{R} $ , for all $ n\in\mathbf{N} $ , and for all $ t_{1},\cdots,t_{n} $ .

Definition. Jointly SSS

Two random processes are jointly SSS if their joint probabilistic description is invariant to shifts in the time origin:

$ F_{\mathbf{X}\left(t_{1}\right),\cdots,\mathbf{X}\left(t_{n}\right),\mathbf{Y}\left(t_{1}\right),\cdots,\mathbf{Y}\left(t_{n}\right)}\left(x_{1},\cdots,x_{n},y_{1},\cdots,y_{n}\right)=F_{\mathbf{X}\left(t_{1}+c\right),\cdots,\mathbf{X}\left(t_{n}+c\right),\mathbf{Y}\left(t_{1}+c\right),\cdots,\mathbf{Y}\left(t_{n}+c\right)}\left(x_{1},\cdots,x_{n},y_{1},\cdots,y_{n}\right) $

for all $ c\in\mathbf{R} $ , for all $ n\in\mathbf{N} $ , and for all $ t_{1},\cdots,t_{n} $ .

Definition

A complex random process $ \mathbf{Z}\left(t\right)=\mathbf{X}\left(t\right)+i\mathbf{Y}\left(t\right) $ where $ \mathbf{X}\left(t\right) $ and $ \mathbf{Y}\left(t\right) $ are real processes is SSS if $ \mathbf{X}\left(t\right) $ and $ \mathbf{Y}\left(t\right) $ are jointly SSS.

Notes on stationary random processes

1. The first-order $ \left(n=1\right) $ CDF or PDF of a stationary random process is independent of time: $ f_{\mathbf{X}\left(t\right)}\left(x\right)=f_{\mathbf{X}\left(t+c\right)}\left(x\right),\quad\forall t,c\in\mathbf{R}. $

2. The second-order $ \left(n=2\right) $ CDF or PDF is a function of time only through the time difference $ \tau_{12}=t_{1}-t_{2} $: $ f_{\mathbf{X}\left(t_{1}\right)\mathbf{X}\left(t_{2}\right)}\left(x_{1},x_{2}\right)=f_{\mathbf{X}\left(t_{1}+c\right)\mathbf{X}\left(t_{2}+c\right)}\left(x_{1},x_{2}\right)=f\left(x_{1},x_{2};\tau_{12}\right). $

Definition. WSS

• A random process $ \mathbf{X}\left(t\right) $ is called wide sense stationary (WSS) if it satisfies the following two conditions:

1. $ E\left[\mathbf{X}\left(t\right)\right]=\eta_{\mathbf{X}}\left(t\right)=\eta_{\mathbf{X}} $, a constant.

2. $ E\left[\mathbf{X}\left(t_{1}\right)\mathbf{X}^{*}\left(t_{2}\right)\right]=R_{\mathbf{XX}}\left(t_{1},t_{2}\right)=R_{\mathbf{X}}\left(t_{1}-t_{2}\right)=R_{\mathbf{X}}\left(\tau\right) $ where $ \tau=t_{1}-t_{2} $ .

• For a WSS random process $ \mathbf{X}\left(t\right) $ , $ E\left[\left|\mathbf{X}\left(t\right)\right|^{2}\right]=R_{\mathbf{XX}}\left(t_{1},t_{2}\right)|_{t_{1}=t,t_{2}=t}=R_{\mathbf{XX}}\left(t,t\right)=R_{\mathbf{X}}\left(t-t\right)=R_{\mathbf{X}}\left(0\right). $

• The autocovariance function of a WSS random process $ \mathbf{X}\left(t\right) $ is $ C_{\mathbf{XX}}\left(t_{1},t_{2}\right)=E\left[\left(\mathbf{X}\left(t_{1}\right)-\eta_{\mathbf{X}}\right)\left(\mathbf{X}\left(t_{2}\right)-\eta_{\mathbf{X}}\right)^{*}\right]=R_{\mathbf{X}}\left(t_{1}-t_{2}\right)-\eta_{\mathbf{X}}\eta_{\mathbf{X}}^{*}=R_{\mathbf{X}}\left(\tau\right)-\eta_{\mathbf{X}}\eta_{\mathbf{X}}^{*}=C_{\mathbf{X}}\left(\tau\right) $ where $ \tau=t_{1}-t_{2} $ .
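A standard example satisfying both conditions is the random-phase cosine $ \mathbf{X}\left(t\right)=\cos\left(\omega_{0}t+\boldsymbol{\Theta}\right) $ with $ \boldsymbol{\Theta} $ uniform on $ \left[0,2\pi\right) $, for which $ \eta_{\mathbf{X}}=0 $ and $ R_{\mathbf{X}}\left(\tau\right)=\frac{1}{2}\cos\left(\omega_{0}\tau\right) $. A minimal numerical check:

```python
import numpy as np

rng = np.random.default_rng(0)

# X(t) = cos(w0 * t + Theta), Theta ~ Uniform[0, 2*pi): WSS with constant
# mean 0 and autocorrelation R_X(tau) = cos(w0 * tau) / 2.
w0 = 2.0
Theta = rng.uniform(0.0, 2.0 * np.pi, 200_000)

def X(t):
    return np.cos(w0 * t + Theta)  # one value per realization of Theta

t1, t2 = 0.7, 1.9
print(np.mean(X(t1)))                # ~ 0: the mean does not depend on t1
print(np.mean(X(t1) * X(t2)))        # ~ R_X(t1 - t2), depends only on t1 - t2
print(0.5 * np.cos(w0 * (t1 - t2)))  # exact value for comparison
```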

Note

If $ \mathbf{X}\left(t\right) $ is an SSS random process, then it is WSS. The converse is NOT true in general; the Gaussian random process is an exception, as the following theorem shows.

Theorem

If $ \mathbf{X}\left(t\right) $ is a WSS random process and it is Gaussian, then it is SSS.

Proof

Assume $ \mathbf{X}\left(t\right) $ has mean $ \eta_{\mathbf{X}}\left(t\right)=E\left[\mathbf{X}\left(t\right)\right]=\eta_{\mathbf{X}} $ and covariance $ C_{\mathbf{XX}}\left(t_{1},t_{2}\right)=C_{\mathbf{X}}\left(t_{1}-t_{2}\right) $ . Then the random variables $ \mathbf{X}\left(t_{1}+c\right),\cdots,\mathbf{X}\left(t_{n}+c\right) $ have the joint characteristic function

$ \Phi_{\mathbf{X}\left(t_{1}+c\right)\cdots\mathbf{X}\left(t_{n}+c\right)}\left(\omega_{1},\cdots,\omega_{n}\right)=\exp\left\{ i\sum_{k=1}^{n}\eta_{\mathbf{X}}\left(t_{k}+c\right)\omega_{k}-\frac{1}{2}\sum_{j=1}^{n}\sum_{k=1}^{n}C_{\mathbf{XX}}\left(t_{j}+c,t_{k}+c\right)\omega_{j}\omega_{k}\right\} $

$ =\exp\left\{ i\sum_{k=1}^{n}\eta_{\mathbf{X}}\left(t_{k}\right)\omega_{k}-\frac{1}{2}\sum_{j=1}^{n}\sum_{k=1}^{n}C_{\mathbf{XX}}\left(t_{j},t_{k}\right)\omega_{j}\omega_{k}\right\} $

$ =\Phi_{\mathbf{X}\left(t_{1}\right)\cdots\mathbf{X}\left(t_{n}\right)}\left(\omega_{1},\cdots,\omega_{n}\right), $

where the second equality holds because $ \eta_{\mathbf{X}}\left(t_{k}+c\right)=\eta_{\mathbf{X}}=\eta_{\mathbf{X}}\left(t_{k}\right) $ and $ C_{\mathbf{XX}}\left(t_{j}+c,t_{k}+c\right)=C_{\mathbf{X}}\left(t_{j}-t_{k}\right)=C_{\mathbf{XX}}\left(t_{j},t_{k}\right) $.

$ \therefore\mathbf{X}\left(t\right) $ is SSS.


Back to ECE600

Back to General Concepts of Stochastic Processes
