


ECE Ph.D. Qualifying Exam

Communication, Networking, Signal and Image Processing (CS)

Question 1: Probability and Random Processes

August 2011



Question

Part 1. 25 pts


Let $ \mathbf{X} $, $ \mathbf{Y} $, and $ \mathbf{Z} $ be three jointly distributed random variables with joint pdf

        $ f_{XYZ}(x,y,z)= \frac{3z^{2}}{7\sqrt{2\pi}}\, e^{-zy} \exp\left[ -\frac{1}{2}\left( \frac{x-y}{z}\right)^{2} \right] \cdot 1_{[0,\infty)}(y)\cdot 1_{[1,2]}(z). $

(a) Find the joint probability density function $ f_{YZ}(y,z) $.

(b) Find $ f_{X}(x|y,z) $.

(c) Find $ f_{Z}(z) $.

(d) Find $ f_{Y}(y|z) $.

(e) Find $ f_{XY}(x,y|z) $.
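
As a rough check on part (a) (a sketch only, not an official solution; see the student answers linked below): the only $ x $-dependence in $ f_{XYZ} $ is a Gaussian factor in $ x $ with mean $ y $ and standard deviation $ z $, so integrating over $ x $ contributes a factor of $ z\sqrt{2\pi} $:

        $ f_{YZ}(y,z)=\int_{-\infty}^{\infty} f_{XYZ}(x,y,z)\,dx = \frac{3z^{3}}{7}\, e^{-zy}\cdot 1_{[0,\infty)}(y)\cdot 1_{[1,2]}(z). $

This marginal integrates to one over $ y\in[0,\infty) $ and $ z\in[1,2] $, which is consistent with $ f_{XYZ} $ being a valid joint pdf; the remaining parts follow from the usual ratios of marginals.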


Click here to view student answers and discussions

Part 2. 25 pts


Show that if a continuous-time Gaussian random process $ \mathbf{X}(t) $ is wide-sense stationary, it is also strict-sense stationary.
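
One standard line of argument (offered only as a sketch; see the student answers linked below): the finite-dimensional distributions of a Gaussian process are completely determined by its mean function and autocovariance function. Wide-sense stationarity gives

        $ \mu_{\mathbf{X}}(t)=\mu \quad \text{and} \quad C_{\mathbf{XX}}(t_1,t_2)=C_{\mathbf{XX}}(t_1-t_2), $

so for any sample times $ t_1,\dots,t_n $ and any shift $ \tau $, the Gaussian random vector $ \left(\mathbf{X}(t_1+\tau),\dots,\mathbf{X}(t_n+\tau)\right) $ has the same mean vector and covariance matrix as $ \left(\mathbf{X}(t_1),\dots,\mathbf{X}(t_n)\right) $, and therefore the same joint pdf.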


Click here to view student answers and discussions

Part 3. 25 pts

Show that the sum of two jointly distributed Gaussian random variables that are not necessarily statistically independent is a Gaussian random variable.
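
One common approach (sketched here under the usual definitions; see the student answers linked below) uses the joint characteristic function. If $ \mathbf{X} $ and $ \mathbf{Y} $ are jointly Gaussian with means $ \mu_X, \mu_Y $, variances $ \sigma_X^2, \sigma_Y^2 $, and covariance $ \sigma_{XY} $, then

        $ \Phi_{\mathbf{X}+\mathbf{Y}}(\omega)=E\left[e^{i\omega(\mathbf{X}+\mathbf{Y})}\right]=\Phi_{\mathbf{XY}}(\omega,\omega)=\exp\left[i\omega(\mu_X+\mu_Y)-\frac{\omega^2}{2}\left(\sigma_X^2+2\sigma_{XY}+\sigma_Y^2\right)\right], $

which is the characteristic function of a Gaussian random variable with mean $ \mu_X+\mu_Y $ and variance $ \sigma_X^2+2\sigma_{XY}+\sigma_Y^2 $.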

Click here to view student answers and discussions

Part 4. 25 pts


Assume that $ \mathbf{X}(t) $ is a zero-mean continuous-time Gaussian white noise process with autocorrelation function

                $ R_{\mathbf{XX}}(t_1,t_2)=\delta(t_1-t_2). $

Let $ \mathbf{Y}(t) $ be a new random process obtained by passing $ \mathbf{X}(t) $ through a linear time-invariant system with impulse response $ h(t) $ whose Fourier transform $ H(\omega) $ has the ideal low-pass characteristic

               $ H(\omega) = \begin{cases} 1, & \mbox{if } |\omega|<\Omega,\\ 0, & \mbox{elsewhere,} \end{cases} $

where $ \Omega>0 $.
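
For parts a)–d) below, the standard second-order input-output relations for a wide-sense stationary process driving an LTI system may be useful (quoted here only as a reminder, assuming the usual Fourier-transform conventions):

        $ E[\mathbf{Y}(t)]=H(0)\,E[\mathbf{X}(t)], \qquad S_{\mathbf{YY}}(\omega)=|H(\omega)|^{2}\,S_{\mathbf{XX}}(\omega), $

where $ S_{\mathbf{XX}}(\omega) $ is the Fourier transform of $ R_{\mathbf{XX}}(\tau) $; here $ R_{\mathbf{XX}}(\tau)=\delta(\tau) $ gives $ S_{\mathbf{XX}}(\omega)=1 $.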

a) Find the mean of $ \mathbf{Y}(t) $.

b) Find the autocorrelation function of $ \mathbf{Y}(t) $.

c) Find the joint pdf of $ \mathbf{Y}(t_1) $ and $ \mathbf{Y}(t_2) $ for any two arbitrary sample times $ t_1 $ and $ t_2 $.

d) What is the minimum time difference $ t_1-t_2 $ such that $ \mathbf{Y}(t_1) $ and $ \mathbf{Y}(t_2) $ are statistically independent?
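
A brief sketch of where these relations lead (again, not an official solution; see the student answers linked below): with $ S_{\mathbf{XX}}(\omega)=1 $,

        $ R_{\mathbf{YY}}(t_1,t_2)=\frac{1}{2\pi}\int_{-\Omega}^{\Omega} e^{i\omega(t_1-t_2)}\,d\omega=\frac{\sin\left(\Omega(t_1-t_2)\right)}{\pi(t_1-t_2)}, $

so $ \mathbf{Y}(t_1) $ and $ \mathbf{Y}(t_2) $ are zero-mean jointly Gaussian (a linear operation preserves Gaussianity), for which uncorrelated is equivalent to independent; the autocorrelation first vanishes at $ |t_1-t_2|=\pi/\Omega $.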

Click here to view student answers and discussions

Back to ECE Qualifying Exams (QE) page
