[[Category:CNSIP]]
[[Category:problem solving]]
[[Category:communication networks signal and image processing]]
[[Category:random variables]]
[[Category:probability]]

= [[ECE_PhD_Qualifying_Exams|ECE Ph.D. Qualifying Exam]] in Communication Networks Signal and Image Processing (CS), Question 1, August 2011 =

<center>
<font size="4">
[[ECE_PhD_Qualifying_Exams|ECE Ph.D. Qualifying Exam]]
</font>

<font size="4">
Communication, Networking, Signal and Image Processing (CS)

Question 1: Probability and Random Processes
</font>

August 2011
</center>
----
==Question==

'''Part 1.''' 25 pts

Let <math>\mathbf{X}</math>, <math>\mathbf{Y}</math>, and <math>\mathbf{Z}</math> be three jointly distributed random variables with joint pdf

&nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; <math>f_{XYZ}(x,y,z)= \frac{3z^{2}}{7\sqrt{2\pi}}\, e^{-zy} \exp\left[-\frac{1}{2}\left(\frac{x-y}{z}\right)^{2}\right]\cdot 1_{[0,\infty)}(y)\cdot 1_{[1,2]}(z).</math>

(a) Find the joint probability density function <math>f_{YZ}(y,z)</math>.

(b) Find <math>f_{X}(x|y,z)</math>.

(c) Find <math>f_{Z}(z)</math>.

(d) Find <math>f_{Y}(y|z)</math>.

(e) Find <math>f_{XY}(x,y|z)</math>.
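
The stated pdf is easy to sanity-check numerically before attempting the marginalizations. The Python sketch below is not part of the original exam and assumes NumPy and SciPy are available; it confirms that <math>f_{XYZ}</math> integrates to one over its support and tabulates <math>f_{Z}(z)</math> by brute-force quadrature, which can be compared with the closed forms found in parts (a) and (c).

<pre>
import numpy as np
from scipy import integrate

def f_xyz(x, y, z):
    """Joint pdf from Part 1 (zero outside y >= 0 and 1 <= z <= 2)."""
    if y < 0.0 or not (1.0 <= z <= 2.0):
        return 0.0
    return (3.0 * z**2 / (7.0 * np.sqrt(2.0 * np.pi))
            * np.exp(-z * y)
            * np.exp(-0.5 * ((x - y) / z) ** 2))

def f_yz(y, z):
    """f_YZ(y,z): integrate out x (the Gaussian factor is centered at y)."""
    val, _ = integrate.quad(f_xyz, y - 10.0 * z, y + 10.0 * z, args=(y, z))
    return val

def f_z(z):
    """f_Z(z): integrate out y as well."""
    val, _ = integrate.quad(f_yz, 0.0, np.inf, args=(z,))
    return val

total, _ = integrate.quad(f_z, 1.0, 2.0)
print("integral of f_XYZ over its support:", round(total, 4))   # expect ~1
for z in (1.0, 1.5, 2.0):
    print("f_Z(%.1f) is approximately %.4f" % (z, f_z(z)))
</pre>

The same nested-quadrature idea gives quick numerical spot checks for the conditional densities in (b), (d), and (e) once a closed form has been worked out.
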
  
 
:'''Click here to view student answers and discussions'''
----
'''Part 2.''' 25 pts

Show that if a continuous-time Gaussian random process <math>\mathbf{X}(t)</math> is wide-sense stationary, it is also strict-sense stationary.
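
The essential fact behind Part 2 is that every finite-dimensional distribution of a Gaussian process is determined by its mean and autocorrelation functions. The sketch below is a numerical illustration rather than a proof and is not part of the original exam: it assumes an arbitrary wide-sense stationary Gaussian model (constant zero mean, autocorrelation <math>e^{-|\tau|}</math>) and checks that shifting every sample time by the same amount leaves the joint pdf unchanged.

<pre>
import numpy as np
from scipy.stats import multivariate_normal

def R(tau):
    """Hypothetical WSS autocorrelation, chosen only for illustration."""
    return np.exp(-np.abs(tau))

def finite_dim_model(times, mean=0.0):
    """Mean vector and covariance matrix of (X(t_1), ..., X(t_n)) for a
    Gaussian process with constant mean and autocorrelation R(t_i - t_j)."""
    times = np.asarray(times, dtype=float)
    cov = R(times[:, None] - times[None, :])
    return np.full(len(times), mean), cov

t = np.array([0.3, 1.1, 2.5])      # arbitrary sample times
shift = 4.7                        # arbitrary time shift

mu1, C1 = finite_dim_model(t)
mu2, C2 = finite_dim_model(t + shift)
print(np.allclose(mu1, mu2), np.allclose(C1, C2))         # True True

# Since mean and covariance are unchanged, so is the joint Gaussian pdf:
x = np.array([0.2, -0.4, 1.0])
print(np.isclose(multivariate_normal(mu1, C1).pdf(x),
                 multivariate_normal(mu2, C2).pdf(x)))    # True
</pre>
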
 
:'''Click [[ECE-QE_CS1-2011_solusion-2|here]] to view student [[ECE-QE_CS1-2011_solusion-2|answers and discussions]]'''
----
'''Part 3.''' 25 pts

Show that the sum of two jointly distributed Gaussian random variables that are not necessarily statistically independent is a Gaussian random variable.
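
A quick Monte Carlo check of this claim is easy to run; the sketch below is illustrative only and not part of the original exam, and the means, variances, and correlation coefficient are arbitrary choices. If the claim holds, the sum should follow the normal law whose mean is the sum of the means and whose variance is the sum of all entries of the covariance matrix.

<pre>
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Jointly Gaussian (X, Y), deliberately correlated; all values are
# arbitrary choices for illustration.
mu = np.array([1.0, -2.0])
sigma = np.array([2.0, 3.0])
rho = -0.6
cov = np.array([[sigma[0]**2,               rho * sigma[0] * sigma[1]],
                [rho * sigma[0] * sigma[1], sigma[1]**2]])

xy = rng.multivariate_normal(mu, cov, size=200_000)
s = xy[:, 0] + xy[:, 1]

m_pred = mu.sum()        # mean of the sum
v_pred = cov.sum()       # var_X + var_Y + 2*cov(X,Y)
print("sample mean/variance:", round(s.mean(), 3), round(s.var(), 3))
print("predicted           :", m_pred, v_pred)

# Kolmogorov-Smirnov comparison against the predicted normal law.
ks = stats.kstest(s, "norm", args=(m_pred, np.sqrt(v_pred)))
print("KS statistic = %.4f, p-value = %.3f" % (ks.statistic, ks.pvalue))
</pre>
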
:'''Click [[ECE-QE_CS1-2011_solusion-3|here]] to view student [[ECE-QE_CS1-2011_solusion-3|answers and discussions]]'''

----
'''Part 4.''' 25 pts

Assume that <math>\mathbf{X}(t)</math> is a zero-mean continuous-time Gaussian white noise process with autocorrelation function

&nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; <math>R_{\mathbf{XX}}(t_1,t_2)=\delta(t_1-t_2).</math>

Let <math>\mathbf{Y}(t)</math> be a new random process obtained by passing <math>\mathbf{X}(t)</math> through a linear time-invariant system with impulse response <math>h(t)</math> whose Fourier transform <math>H(\omega)</math> has the ideal low-pass characteristic
&nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp;<math>H(\omega) =
 +
\begin{cases}
 +
1, & \mbox{if } |\omega|\leq\Omega,\\
 +
0, & \mbox{elsewhere,}
 +
\end{cases}
 +
</math>
 +
 +
where <math>\Omega>0</math>.

a) Find the mean of <math>\mathbf{Y}(t)</math>.

b) Find the autocorrelation function of <math>\mathbf{Y}(t)</math>.

c) Find the joint pdf of <math>\mathbf{Y}(t_1)</math> and <math>\mathbf{Y}(t_2)</math> for any two arbitrary sample times <math>t_1</math> and <math>t_2</math>.

d) What is the minimum time difference <math>t_1-t_2</math> such that <math>\mathbf{Y}(t_1)</math> and <math>\mathbf{Y}(t_2)</math> are statistically independent?
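
As a numerical companion to this problem (not part of the original exam; <math>\Omega</math>, the time step, and the sample count are arbitrary illustrative choices), the sketch below approximates the white noise by independent Gaussian samples of variance <math>1/\Delta t</math>, applies the ideal low-pass filter in the frequency domain, and compares the empirical mean and autocorrelation of <math>\mathbf{Y}(t)</math> with the values obtained by numerically inverting the output power spectrum <math>|H(\omega)|^2 S_{\mathbf{XX}}(\omega)</math>.

<pre>
import numpy as np

rng = np.random.default_rng(1)

# Illustrative parameters (not specified in the exam): a cutoff Omega and a
# time step dt small enough that Omega lies well inside the band pi/dt.
Omega = 2.0 * np.pi          # rad/s
dt = 0.01                    # s
N = 2**20                    # number of samples

# Discrete surrogate for white noise with R_XX(t1,t2) = delta(t1 - t2):
# independent zero-mean Gaussian samples with variance 1/dt.
x = rng.normal(scale=1.0 / np.sqrt(dt), size=N)

# Ideal low-pass filter applied in the frequency domain (brick wall at Omega).
omega = 2.0 * np.pi * np.fft.fftfreq(N, d=dt)    # rad/s
X = np.fft.fft(x)
X[np.abs(omega) > Omega] = 0.0
y = np.fft.ifft(X).real

print("empirical mean of Y(t): %.4f" % y.mean())          # expect ~0

# Empirical autocorrelation at a few lags, compared with the value obtained
# by numerically inverting the output power spectrum |H(w)|^2 * S_XX(w).
w = np.linspace(-Omega, Omega, 4001)
dw = w[1] - w[0]
for k in (0, 10, 25, 50):
    tau = k * dt
    r_emp = np.mean(y[:N - k] * y[k:])
    r_num = np.sum(np.cos(w * tau)) * dw / (2.0 * np.pi)
    print("tau = %.2f s  empirical %.3f  numerical %.3f" % (tau, r_emp, r_num))
</pre>
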
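
For parts c) and d), the relevant quantity is the correlation between the two samples, since jointly Gaussian random variables are independent exactly when they are uncorrelated. The short sketch below (same caveats as above; <math>\Omega</math> is the same arbitrary choice) traces the normalized correlation as a function of the time difference and locates its first zero numerically, which can be compared with whatever closed-form answer is derived.

<pre>
import numpy as np

# Normalized correlation of Y(t1) and Y(t2) as a function of tau = t1 - t2,
# obtained by numerically inverting the output power spectrum.
Omega = 2.0 * np.pi                  # rad/s, illustration only
w = np.linspace(-Omega, Omega, 4001)
dw = w[1] - w[0]

def r_yy(tau):
    """(1/2pi) * integral over [-Omega, Omega] of cos(w*tau) dw."""
    return np.sum(np.cos(w * tau)) * dw / (2.0 * np.pi)

taus = np.linspace(0.0, 2.0, 2001)
rho = np.array([r_yy(t) for t in taus]) / r_yy(0.0)

# Y(t1) and Y(t2) are jointly Gaussian, so they are independent exactly when
# this correlation vanishes; locate the first zero crossing numerically.
first_zero = taus[np.argmax(rho <= 0.0)]
print("correlation first reaches zero near tau = %.3f s" % first_zero)
</pre>
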
:'''Click [[ECE-QE_CS1-2011_solusion-4|here]] to view student [[ECE-QE_CS1-2011_solusion-4|answers and discussions]]'''
 
----
[[ECE_PhD_Qualifying_Exams|Back to ECE Qualifying Exams (QE) page]]