ECE Ph.D. Qualifying Exam in "Communication, Networks, Signal, and Image Processing" (CS)
Question 1, August 2011, Part 2
$ \color{blue}\text{Show that if a continuous-time Gaussian random process } \mathbf{X}(t) \text{ is wide-sense stationary, it is also strict-sense stationary.} $
$ \color{blue}\text{Solution 1:} $
$ \mathbf{X}(t) \text{ is SSS if } F_{(t_1+\tau)...(t_n+\tau)}(x_1,...,x_n) \text{ does not depend on } \tau. \text{ Since the joint characteristic function uniquely determines the joint CDF, it suffices to show that } \Phi_{(t_1+\tau)...(t_n+\tau)}(\omega_1,...,\omega_n) \text{ does not depend on } \tau: $
$ \Phi_{(t_1+\tau)...(t_n+\tau)}(\omega_1,...,\omega_n) = E \left [e^{i\sum_{j=1}^{n}{\omega_jX(t_j+\tau)}} \right ] $
$ \text{Define } Y_\tau = \sum_{j=1}^{n}{\omega_jX(t_j+\tau)} \text{, so} $
$ \Phi_{(t_1+\tau)...(t_n+\tau)}(\omega_1,...,\omega_n) = E \left [e^{iY_\tau} \right ] = \Phi_{Y_\tau}(1) $
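For completeness, the identity used in the next step is the standard characteristic function of a Gaussian random variable, evaluated at $ \omega = 1 $:

$ \Phi_{Y_\tau}(1) = E \left [e^{iY_\tau} \right ] = e^{\,iE[Y_\tau] - \frac{1}{2}var(Y_\tau)} $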
$ \text{Since } Y_\tau \text{ is a linear combination of jointly Gaussian random variables, it is Gaussian and hence characterized by its mean and variance alone. So we just need to show that the mean and variance of } Y_\tau \text{ do not depend on } \tau. \text{ Since } \mathbf{X}(t) \text{ is WSS, } E[Y_\tau] = \sum_{j=1}^{n}{\omega_j E[X(t_j+\tau)]} = \mu \sum_{j=1}^{n}{\omega_j} \text{, which does not depend on } \tau. \text{ For the variance,} $
$ var(Y_\tau) = E \left [ \left ( \sum_{j=1}^{n}{\omega_j(X(t_j+\tau)-\mu)} \right )^2 \right ] $
$ =\sum_{j=1}^{n}{\omega_j^2E \left [ (X(t_j+\tau)-\mu)^2 \right ]} + \sum_{i \neq j}{\omega_i \omega_j E \left[ (X(t_i+\tau)-\mu)(X(t_j+\tau)-\mu) \right]} $
$ =\sum_{j=1}^{n}{\omega_j^2 \, cov(X(t_j+\tau),X(t_j+\tau))} + \sum_{i \neq j}{\omega_i \omega_j \, cov(X(t_i+\tau),X(t_j+\tau))} = \sum_{j=1}^{n}{\omega_j^2 \, C_X(0)} + \sum_{i \neq j}{\omega_i \omega_j \, C_X(t_i-t_j)} $
$ \text{where the last equality uses WSS: } cov(X(t_i+\tau),X(t_j+\tau)) = C_X(t_i-t_j) \text{ depends only on the time difference. Hence } var(Y_\tau) \text{ does not depend on } \tau. $
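Combining the mean and variance above gives the characteristic function explicitly (an added summary line); no $ \tau $ appears on the right-hand side:

$ \Phi_{(t_1+\tau)...(t_n+\tau)}(\omega_1,...,\omega_n) = \exp \left ( i\mu\sum_{j=1}^{n}{\omega_j} - \frac{1}{2}\sum_{j=1}^{n}{\omega_j^2 C_X(0)} - \frac{1}{2}\sum_{i \neq j}{\omega_i \omega_j C_X(t_i-t_j)} \right ) $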
$ \color{blue}\text{Solution 2:} $
$ \text{Suppose } \mathbf{X}(t) \text{ is a Gaussian Random Process} $
$ \Rightarrow f(x(t_1),x(t_2),...,x(t_k)) = \frac{1}{(2\pi)^{\frac{k}{2}} |\Sigma |^{\frac{1}{2}}} \exp \left (-\frac{1}{2}(\overrightarrow{x} - \overrightarrow{m})^T \Sigma ^{-1}(\overrightarrow{x} - \overrightarrow{m}) \right ) $
$ \text{for any number of time instances } t_1,...,t_k \text{, where } \overrightarrow{m} = (m_X(t_1),...,m_X(t_k))^T \text{ is the mean vector and } \Sigma \text{ is the covariance matrix of } (X(t_1),...,X(t_k)). $
$ \text{If } \mathbf{X}(t) \text{ is WSS} $
$ \Rightarrow \text{ (1) } m_X(t_1) = m_X(t_2) = ... = m_X(t_k) = m $
$ \text{ (2) } R_X(t_i,t_j) = R_X(t_i + \tau, t_j + \tau) $
$ \text{The covariance matrix has entries } \Sigma_{ij} = C_X(t_i,t_j) = R_X(t_i,t_j) - m^2 \text{, so} $
$ \Sigma = \begin{bmatrix} C_X(t_1,t_1) &... &C_X(t_1,t_k)\\ \vdots & &\vdots \\ C_X(t_k,t_1) &... &C_X(t_k,t_k)\\ \end{bmatrix} $
$ \text{From (1): } \overrightarrow{m}' = (m_X(t_1+\tau) , m_X(t_2+\tau) , ... , m_X(t_k+\tau)) = \overrightarrow{m} $
$ \text{From (1) and (2): } \Sigma' = \begin{bmatrix} C_X(t_1+\tau,t_1+\tau) &... &C_X(t_1+\tau,t_k+\tau)\\ \vdots & &\vdots \\ C_X(t_k+\tau,t_1+\tau) &... &C_X(t_k+\tau,t_k+\tau)\\ \end{bmatrix} = \begin{bmatrix} C_X(t_1,t_1) &... &C_X(t_1,t_k)\\ \vdots & &\vdots \\ C_X(t_k,t_1) &... &C_X(t_k,t_k)\\ \end{bmatrix} = \Sigma $
$ \text{So } f(x(t_1+\tau),x(t_2+\tau),...,x(t_k+\tau)) \text{ does not depend on } \tau: $
$ f(x(t_1+\tau),x(t_2+\tau),...,x(t_k+\tau)) $
$ = f(x(t_1),x(t_2),...,x(t_k)) $
$ {\color{red} \text{It is not clear how the student concludes that the pdf does not depend on } \tau!} $
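Spelling out the flagged step (an added clarification, using only what is shown above): the Gaussian density depends on the sample times only through $ \overrightarrow{m} $ and $ \Sigma $, which were just shown to be unchanged by the shift, so

$ f(x(t_1+\tau),...,x(t_k+\tau)) = \frac{1}{(2\pi)^{\frac{k}{2}} |\Sigma' |^{\frac{1}{2}}} \exp \left (-\frac{1}{2}(\overrightarrow{x} - \overrightarrow{m}')^T (\Sigma')^{-1}(\overrightarrow{x} - \overrightarrow{m}') \right ) = \frac{1}{(2\pi)^{\frac{k}{2}} |\Sigma |^{\frac{1}{2}}} \exp \left (-\frac{1}{2}(\overrightarrow{x} - \overrightarrow{m})^T \Sigma ^{-1}(\overrightarrow{x} - \overrightarrow{m}) \right ) = f(x(t_1),...,x(t_k)) $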
$ \Rightarrow \mathbf{X}(t) \text{ is Strict Sense Stationary. } $
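As an informal numerical illustration of Solution 2 (a sketch, not part of either solution), the Python snippet below assumes a WSS Gaussian model with constant mean 1 and autocovariance $ C_X(\Delta t) = e^{-|\Delta t|} $ (both illustrative choices), builds the joint density of the samples from its mean vector and covariance matrix, and checks that shifting every sample time by $ \tau $ leaves the density value unchanged.

 import numpy as np
 from scipy.stats import multivariate_normal
 
 # Illustrative WSS model: constant mean and an autocovariance that
 # depends only on the time difference (assumed, not from the exam).
 mu = 1.0
 def C_X(dt):
     return np.exp(-np.abs(dt))
 
 def joint_pdf(times, x):
     # Mean vector and covariance matrix of (X(t_1), ..., X(t_k))
     m = np.full(len(times), mu)
     Sigma = C_X(times[:, None] - times[None, :])
     return multivariate_normal(mean=m, cov=Sigma).pdf(x)
 
 t = np.array([0.3, 1.1, 2.5])   # sample times t_1, ..., t_k
 x = np.array([0.7, 1.4, 0.2])   # an arbitrary evaluation point
 tau = 4.2                       # an arbitrary shift
 
 print(joint_pdf(t, x))          # f(x(t_1), ..., x(t_k))
 print(joint_pdf(t + tau, x))    # identical value: no dependence on tau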
"Communication, Networks, Signal, and Image Processing" (CS)- Question 1, August 2011
Go to
- Part 1: solutions and discussions
- Part 2: solutions and discussions