
ECE Ph.D. Qualifying Exam in "Communication, Networks, Signal, and Image Processing" (CS)

Question 1, August 2011, Part 2


 $ \color{blue}\text{Show that if a continuous-time Gaussian random process } \mathbf{X}(t) \text{ is wide-sense stationary, it is also strict-sense stationary.} $

$ \color{blue}\text{Solution 1:} $

$ {\color{green} \text{Recall:}} $

$ {\color{green} \text{A random process is wide sense stationary (WSS) if}} $

$ {\color{green} i) \text{ its mean is constant.}} $

$ {\color{green} ii) \text{ its autocorrelation function depends only on the time difference.}} $

$ {\color{green} \text{A random process is Strict Sense Stationary (SSS) if its joint cdf (of every order) is unchanged by a time shift.}} $
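
$ {\color{green} \text{In symbols (with the notation used in the proof below), these conditions read:}} $

$ {\color{green} \text{WSS: } E[X(t)] = \mu \text{ for all } t \text{, and } R_X(t_1,t_2) = E[X(t_1)X(t_2)] \text{ depends only on } t_1 - t_2.} $

$ {\color{green} \text{SSS: } F_{(t_1+\tau)...(t_n+\tau)}(x_1,...,x_n) = F_{(t_1)...(t_n)}(x_1,...,x_n) \text{ for all } n, \tau \text{, and } t_1,...,t_n.} $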

$ {\color{green} \text{The same is true of the joint characteristic function of the process, so we can use it for the proof:}} $


$ \mathbf{X}(t) \text{ is SSS if } F_{(t_1+\tau)...(t_n+\tau)}(x_1,...,x_n) \text{ does not depend on } \tau. \text{ To show that, we can show that } \Phi_{(t_1+\tau)...(t_n+\tau)}(\omega_1,...,\omega_n) \text{ does not depend on } \tau: $

$ \Phi_{(t_1+\tau)...(t_n+\tau)}(\omega_1,...,\omega_n) = E \left [e^{i\sum_{j=1}^{n}{\omega_jX(t_j+\tau)}} \right ] $


$ \text{Define } Y_\tau = \sum_{j=1}^{n}{\omega_jX(t_j+\tau)} \text{, so} $


$ \Phi_{(t_1+\tau)...(t_n+\tau)}(\omega_1,...,\omega_n) = E \left [e^{iY_\tau} \right ] = \Phi_{Y_\tau}(1) $
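
$ \text{For reference (a standard fact used in the next step): if a random variable } Y \text{ is Gaussian with mean } \mu_Y \text{ and variance } \sigma_Y^2 \text{, then} $

$ \Phi_Y(\omega) = E \left [ e^{i\omega Y} \right ] = e^{i\omega\mu_Y - \frac{1}{2}\omega^2\sigma_Y^2} \text{, so } \Phi_Y(1) \text{ depends only on } \mu_Y \text{ and } \sigma_Y^2. $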


$ \text{Since } \mathbf{X}(t) \text{ is a Gaussian random process, } Y_\tau \text{, being a linear combination of jointly Gaussian samples, is a Gaussian random variable, so it is characterized just by its mean and variance. Thus we only need to show that the mean and variance of } Y_\tau \text{ do not depend on } \tau. \text{ Since } \mathbf{X}(t) \text{ is WSS, its mean is a constant } \mu \text{, so } E[Y_\tau] = \sum_{j=1}^{n}{\omega_j \mu} \text{ does not depend on } \tau. \text{ For the variance,} $


$ var(Y_\tau) = E \left [ \left ( \sum_{j=1}^{n}{\omega_j(X(t_j+\tau)-\mu)} \right )^2 \right ] $


$ =\sum_{j=1}^{n}{\omega_j^2 E \left [ (X(t_j+\tau)-\mu)^2 \right ]} + \sum_{i \neq j}{\omega_i \omega_j E \left[ (X(t_i+\tau)-\mu)(X(t_j+\tau)-\mu) \right]} $


$ =\sum_{j=1}^{n}{\omega_j^2 \, C_X(t_j+\tau,t_j+\tau)} + \sum_{i \neq j}{\omega_i \omega_j \, C_X(t_i+\tau,t_j+\tau)} $


$ \text{where } C_X(s,t) = E[(X(s)-\mu)(X(t)-\mu)] = R_X(s,t) - \mu^2 \text{ is the autocovariance of } \mathbf{X}(t). \text{ Since } \mathbf{X}(t) \text{ is WSS, } C_X(t_i+\tau,t_j+\tau) = R_X(t_i-t_j) - \mu^2 \text{ depends only on } t_i - t_j \text{, so } var(Y_\tau) \text{ does not depend on } \tau. $
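
$ \text{Combining the two observations (neither } E[Y_\tau] \text{ nor } var(Y_\tau) \text{ depends on } \tau \text{) with the Gaussian characteristic function noted above,} $

$ \Phi_{(t_1+\tau)...(t_n+\tau)}(\omega_1,...,\omega_n) = \Phi_{Y_\tau}(1) = e^{i E[Y_\tau] - \frac{1}{2}var(Y_\tau)} $

$ \text{does not depend on } \tau. \text{ Since the joint characteristic function determines the joint cdf, } F_{(t_1+\tau)...(t_n+\tau)}(x_1,...,x_n) \text{ does not depend on } \tau \text{ either, i.e. } \mathbf{X}(t) \text{ is SSS.} $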


$ \color{blue}\text{Solution 2:} $

$ \text{Suppose } \mathbf{X}(t) \text{ is a Gaussian Random Process} $


$ \Rightarrow f(x(t_1),x(t_2),...,x(t_k)) = \frac{1}{(2\pi)^{\frac{k}{2}} |\Sigma |^{\frac{1}{2}}} \exp \left (-\frac{1}{2}(\overrightarrow{x} - \overrightarrow{m})^T \Sigma ^{-1}(\overrightarrow{x} - \overrightarrow{m}) \right ) $


$ \text{for any number of time instances.} $
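
$ \text{Here } \overrightarrow{m} = (m_X(t_1), m_X(t_2), ..., m_X(t_k))^T \text{ is the mean vector and } \Sigma \text{ is the covariance matrix of the samples } (X(t_1),...,X(t_k)). $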


$ \text{If } \mathbf{X}(t) \text{ is WSS} $


$ \Rightarrow \text{ (1) } m_X(t_1) = m_X(t_2) = ... = m_X(t_k) = m $


$ \text{ (2) } R_X(t_i,t_j) = R_X(t_i + \tau, t_j + \tau) $


$ \Sigma = \begin{bmatrix} R_X(t_1,t_1) & \cdots & R_X(t_1,t_k)\\ \vdots & \ddots & \vdots \\ R_X(t_k,t_1) & \cdots & R_X(t_k,t_k) \end{bmatrix} $
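
$ {\color{green} \text{Note: strictly, the entries of the covariance matrix are } \Sigma_{ij} = R_X(t_i,t_j) - m_X(t_i)m_X(t_j). \text{ Since the mean is the constant } m \text{ under WSS, these differ from } R_X(t_i,t_j) \text{ only by the constant } m^2 \text{, so the shift-invariance argument below holds for either choice of entries.}} $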


$ \text{From (1): } \overrightarrow{m}' = (m_X(t_1+\tau) , m_X(t_2+\tau) , ... , m_X(t_k+\tau)) = \overrightarrow{m} $


$ \text{From (2): } \Sigma' = \begin{bmatrix} R_X(t_1,t_1) & \cdots & R_X(t_1,t_k)\\ \vdots & \ddots & \vdots \\ R_X(t_k,t_1) & \cdots & R_X(t_k,t_k) \end{bmatrix} = \Sigma $


$ {\color{green} \text{It is better to clarify that:}} $


$ { \color{green} \text{From (2): } \Sigma' = \begin{bmatrix} R_X(t_1+\tau,t_1+\tau) & \cdots & R_X(t_1+\tau,t_k+\tau)\\ \vdots & \ddots & \vdots \\ R_X(t_k+\tau,t_1+\tau) & \cdots & R_X(t_k+\tau,t_k+\tau) \end{bmatrix} = \begin{bmatrix} R_X(t_1,t_1) & \cdots & R_X(t_1,t_k)\\ \vdots & \ddots & \vdots \\ R_X(t_k,t_1) & \cdots & R_X(t_k,t_k) \end{bmatrix} = \Sigma } $


$ \text{So } f(x(t_1+\tau),x(t_2+\tau),...,x(t_k+\tau)) \text{ does not depend on } \tau \text{:} $


$ f(x(t_1+\tau),x(t_2+\tau),...,x(t_k+\tau)) $


$ = f(x(t_1),x(t_2),...,x(t_k)) $


$ \Rightarrow \mathbf{X}(t) \text{ is Strict Sense Stationary. } $
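
$ \text{Since the joint cdf is obtained by integrating the joint pdf, and the equality above holds for every } k \text{, every choice of } t_1,...,t_k \text{, and every } \tau \text{, this is exactly the cdf condition in the definition of SSS.} $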



"Communication, Networks, Signal, and Image Processing" (CS)- Question 1, August 2011

Go to


Back to ECE Qualifying Exams (QE) page
