= [[ECE PhD Qualifying Exams|ECE Ph.D. Qualifying Exam]] in "Communication, Networks, Signal, and Image Processing" (CS)  =
[[Category:ECE]]
[[Category:QE]]
[[Category:CNSIP]]
[[Category:problem solving]]
[[Category:random variables]]
[[Category:probability]]
  
= [[ECE-QE_CS1-2011|Question 1, August 2011]], Part 2 =
<center>
<font size= 4>
[[ECE_PhD_Qualifying_Exams|ECE Ph.D. Qualifying Exam]]
</font size>
  
<font size= 4>
Communication, Networking, Signal and Image Processing (CS)

----

Question 1: Probability and Random Processes
</font size>
  
August 2011
</center>
----
----
= Part 2 =
Jump to [[ECE-QE_CS1-2011_solusion-1|Part 1]], [[ECE-QE CS1-2011 solusion-2|2]]
----
 
&nbsp;<font color="#ff0000"><span style="font-size: 19px;"><math>\color{blue}\text{Show that if a continuous-time Gaussian random process } \mathbf{X}(t) \text{ is wide-sense stationary, it is also strict-sense stationary.}
</math></span></font>
  
 
===== <math>\color{blue}\text{Solution 1:}</math>  =====
<math>
{\color{green} \text{Recall (should be added):}}
</math>


<math>
{\color{green} \text{A random process is wide-sense stationary (WSS) if}}
</math>


<math>
{\color{green} i) \text{ its mean is constant, and}}
</math>


<math>
{\color{green} ii) \text{ its autocorrelation depends only on the time difference.}}
</math>


<math>
{\color{green} \text{A random process is strict-sense stationary (SSS) if its joint cdf depends only on time differences, i.e., it is invariant to a shift of the time origin.}}
</math>


<math>
{\color{green} \text{The same is true of the joint characteristic function of the process, so we can use it for the proof:}}
</math>
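
The following standard fact, recalled here as a supporting note (the vector notation <math>\overrightarrow{Z}</math>, <math>\overrightarrow{\omega}</math> is introduced only for this note), is what makes the characteristic-function approach work:

<math>
{\color{green} \text{If } \overrightarrow{Z}=(X(t_1),...,X(t_n))^T \text{ is jointly Gaussian with mean vector } \overrightarrow{m} \text{ and covariance matrix } \Sigma \text{, then}}
</math>


<math>
{\color{green} \Phi_{\overrightarrow{Z}}(\omega_1,...,\omega_n) = E\left[e^{i\overrightarrow{\omega}^T\overrightarrow{Z}}\right] = \exp\left(i\overrightarrow{\omega}^T\overrightarrow{m} - \frac{1}{2}\overrightarrow{\omega}^T\Sigma\overrightarrow{\omega}\right),}
</math>


<math>
{\color{green} \text{so the joint characteristic function is determined entirely by the mean vector and the covariance matrix.}}
</math>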
 
<font face="serif"><span style="font-size: 19px;"><math>
\mathbf{X}(t) \text{ is SSS if } F_{(t_1+\tau)...(t_n+\tau)}(x_1,...,x_n) \text{ does not depend on } \tau. \text{ Since the joint characteristic function uniquely determines the joint distribution, it suffices to show that } \Phi_{(t_1+\tau)...(t_n+\tau)}(\omega_1,...,\omega_n) \text{ does not depend on } \tau:
</math></span></font>
  
 
<math>
\Phi_{(t_1+\tau)...(t_n+\tau)}(\omega_1,...,\omega_n) = E \left [e^{i\sum_{j=1}^{n}{\omega_jX(t_j+\tau)}}  \right ]
</math>


<math>
\text{Define } Y(t_j+\tau) = \sum_{j=1}^{n}{\omega_jX(t_j+\tau)} \text{, so}
</math>
  
  
<math>
\Phi_{(t_1+\tau)...(t_n+\tau)}(\omega_1,...,\omega_n) = E \left [e^{iY(t_j+\tau)} \right ] = \Phi_{Y(t_j+\tau)}(1)
</math>
 
  
  
<font face="serif"><span style="font-size: 19px;"><math>
\text{Since } Y(t) \text{ is Gaussian, it is characterized entirely by its mean and variance, so we just need to show that the mean and variance of } Y(t) \text{ do not depend on } \tau. \text{ Since } \mathbf{X}(t) \text{ is WSS, its mean is constant, so } E[Y(t_j+\tau)] = \mu\sum_{j=1}^{n}{\omega_j} \text{ does not depend on } \tau. \text{ For the variance:}
</math></span></font>
  
  
<math>
var(Y(t_j+\tau)) = E \left [ \left (\sum_{j=1}^{n}{\omega_j(X(t_j+\tau)-\mu)} \right )^2  \right ]
</math>


<math>
=\sum_{j=1}^{n}{\omega_j^2E \left [ (X(t_j+\tau)-\mu)^2 \right ]} + \sum_{i \neq j}{\omega_i \omega_j E \left[ (X(t_i+\tau)-\mu)(X(t_j+\tau)-\mu) \right]}
</math>


<math>
=\sum_{j=1}^{n}{\omega_j^2 \, cov(X(t_j+\tau),X(t_j+\tau))}  + \sum_{i \neq j}{\omega_i \omega_j \, cov(X(t_i+\tau),X(t_j+\tau))}
</math>
 
  
  
<font face="serif"><span style="font-size: 19px;"><math>
\text{Since } \mathbf{X}(t) \text{ is WSS, each of these covariances depends only on the time difference } t_i - t_j \text{, so } var(Y(t_j+\tau)) \text{ does not depend on } \tau.
</math></span></font>
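
To make the last step explicit, here is the standard fact being used (added as a note): for a Gaussian random variable <math>Y</math> with mean <math>\mu_Y</math> and variance <math>\sigma_Y^2</math>,

<math>
{\color{green} \Phi_{Y}(1) = E\left[e^{iY}\right] = e^{i\mu_Y - \frac{1}{2}\sigma_Y^2},}
</math>


<math>
{\color{green} \text{so a constant mean and a } \tau\text{-free variance give a characteristic function that does not depend on } \tau \text{, for every choice of } \omega_1,...,\omega_n.}
</math>

A minimal Monte Carlo sketch of this argument, assuming a hypothetical zero-mean WSS Gaussian process with autocovariance <math>C(s)=e^{-|s|}</math> and arbitrary sample times, frequencies, and shift: the empirical joint characteristic function should agree, up to sampling error, with and without the shift.

<pre>
# Sketch only: zero-mean WSS Gaussian process with C(s) = exp(-|s|) (assumed).
# The empirical estimate of Phi(w_1,...,w_n) = E[exp(i*sum_j w_j X(t_j + tau))]
# should match, up to Monte Carlo error, with and without the time shift tau.
import numpy as np

rng = np.random.default_rng(0)

def sample_process(times, n_samples):
    """Draw joint samples of the process at the given times."""
    t = np.asarray(times, dtype=float)
    Sigma = np.exp(-np.abs(t[:, None] - t[None, :]))  # C(t_i - t_j)
    return rng.multivariate_normal(np.zeros(len(t)), Sigma, size=n_samples)

def empirical_cf(samples, w):
    """Monte Carlo estimate of E[exp(i * w . X)]."""
    return np.mean(np.exp(1j * samples @ w))

t = np.array([0.0, 0.5, 1.3])      # sample times (arbitrary choice)
w = np.array([0.4, -0.2, 0.9])     # frequencies omega_j (arbitrary choice)
tau = 1.7                          # time shift (arbitrary choice)

phi = empirical_cf(sample_process(t, 200000), w)
phi_shifted = empirical_cf(sample_process(t + tau, 200000), w)
print(phi, phi_shifted)            # nearly equal, as the proof predicts
</pre>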
 
  
 
----


===== <math>\color{blue}\text{Solution 2:}</math>  =====

<font face="serif"><span style="font-size: 19px;"><math>
\text{Suppose } \mathbf{X}(t) \text{ is a Gaussian random process.}
</math></span></font>
  
  
<font face="serif"><span style="font-size: 19px;"><math>
\Rightarrow f(x(t_1),x(t_2),...,x(t_k)) = \frac{1}{(2\pi)^{\frac{k}{2}} |\Sigma |^{\frac{1}{2}}} \exp\left(-\frac{1}{2}(\overrightarrow{x} - \overrightarrow{m})^T \Sigma ^{-1}(\overrightarrow{x} - \overrightarrow{m})\right)
</math></span></font>


<font face="serif"><span style="font-size: 19px;"><math>
\text{for any number of time instances.}
</math></span></font>


<font face="serif"><span style="font-size: 19px;"><math>
\text{If } \mathbf{X}(t) \text{ is WSS, then}
</math></span></font>
  
  
<font face="serif"><span style="font-size: 19px;"><math>
\Rightarrow \text{ (1) } m_X(t_1) = m_X(t_2) = ... = m_X(t_K) = m
</math></span></font>


<font face="serif"><span style="font-size: 19px;"><math>
\text{ (2) } R_X(t_i,t_j) = R_X(t_i + \tau, t_j + \tau)
</math></span></font>
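
A small supporting remark (added as a note, using only quantities defined above): condition (2) says the autocorrelation, and hence the autocovariance, is a function of the time difference alone.

<math>
{\color{green} \text{Taking } \tau = -t_j \text{ in (2): } R_X(t_i,t_j) = R_X(t_i - t_j, 0) \text{, and likewise } C_X(t_i,t_j) = R_X(t_i,t_j) - m^2 \text{ depends only on } t_i - t_j.}
</math>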
  
  
<math>
\Sigma = \begin{bmatrix}
&R_X(t_1,t_1)  &... &R_X(t_1,t_k)\\
&\vdots        &                \\
&R_X(t_k,t_1)  &... &R_X(t_k,t_k)\\
\end{bmatrix}
</math>
  
  
<font face="serif"><span style="font-size: 19px;"><math>
\text{From (1): } \overrightarrow{m}' = (m_X(t_1+\tau) , m_X(t_2+\tau) , ... , m_X(t_K+\tau)) = \overrightarrow{m}
</math></span></font>
  
  
<math>
\text{From (2): } \Sigma' = \begin{bmatrix}
&R_X(t_1,t_1)  &... &R_X(t_1,t_k)\\
&\vdots        &                \\
&R_X(t_k,t_1)  &... &R_X(t_k,t_k)\\
\end{bmatrix} = \Sigma
</math>
  
  
<font face="serif"><span style="font-size: 19px;"><math>
{\color{green} \text{Should be clarified that:}}
</math></span></font>


<math>{ \color{green}
\text{From (2): }
\Sigma' = \begin{bmatrix}
&R_X(t_1+\tau,t_1+\tau)  &... &R_X(t_1+\tau,t_k+\tau)\\
&\vdots        &                \\
&R_X(t_k+\tau,t_1+\tau)  &... &R_X(t_k+\tau,t_k+\tau)\\
\end{bmatrix} = \begin{bmatrix}
&R_X(t_1,t_1)  &... &R_X(t_1,t_k)\\
&\vdots        &                \\
&R_X(t_k,t_1)  &... &R_X(t_k,t_k)\\
\end{bmatrix} = \Sigma
}</math>


<font face="serif"><span style="font-size: 19px;"><math>
\text{So }  f(x(t_1+\tau),x(t_2+\tau),...,x(t_k+\tau)) \text{ does not depend on } \tau \text{:}
</math></span></font>
  
  
<font face="serif"><span style="font-size: 19px;"><math>
f(x(t_1+\tau),x(t_2+\tau),...,x(t_k+\tau))
</math></span></font>


<font face="serif"><span style="font-size: 19px;"><math>
= f(x(t_1),x(t_2),...,x(t_k))
</math></span></font>
  
  
<math>
\Rightarrow \mathbf{X}(t) \text{ is Strict Sense Stationary. }
</math>
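
A minimal numerical sketch of Solution 2, assuming a hypothetical zero-mean WSS Gaussian process with autocovariance <math>C(s)=e^{-|s|}</math> and arbitrary sample times and shift: the shift changes neither the covariance matrix nor the joint Gaussian density.

<pre>
# Sketch only: zero-mean WSS Gaussian process with C(s) = exp(-|s|) (assumed).
# For any sample times t_1,...,t_k and any shift tau, the covariance matrix,
# and therefore the k-dimensional Gaussian pdf, is unchanged by the shift.
import numpy as np
from scipy.stats import multivariate_normal

def cov_matrix(times, C=lambda s: np.exp(-np.abs(s))):
    """Covariance matrix of the process sampled at the given times."""
    t = np.asarray(times, dtype=float)
    return C(t[:, None] - t[None, :])

t = np.array([0.0, 0.7, 1.5, 3.2])   # sample times (arbitrary choice)
tau = 2.4                            # time shift (arbitrary choice)

Sigma = cov_matrix(t)
Sigma_shifted = cov_matrix(t + tau)
print(np.allclose(Sigma, Sigma_shifted))    # True: Sigma' = Sigma

# The joint density then agrees at any point x, as the proof concludes.
x = np.array([0.3, -1.1, 0.4, 2.0])
f = multivariate_normal(mean=np.zeros(len(t)), cov=Sigma).pdf(x)
f_shifted = multivariate_normal(mean=np.zeros(len(t)), cov=Sigma_shifted).pdf(x)
print(np.isclose(f, f_shifted))             # True
</pre>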
  
 
----
