Revision as of 15:41, 25 January 2014


ECE Ph.D. Qualifying Exam

Communication, Networking, Signal and Image Processing (CS)

Question 1: Probability and Random Processes

August 2012



Question

Part 1. 25 pts


State and prove the Chebyshev inequality for random variable $ \mathbf{X} $ with mean $ \mu $ and variance $ \sigma^2 $. In constructing your proof, keep in mind that $ \mathbf{X} $ may be either a discrete or continuous random variable.
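One standard route to the result (a sketch, not necessarily the intended exam solution) bounds the variance integral from below on the event $ \{|\mathbf{X}-\mu| \ge \varepsilon\} $; the discrete case is identical with sums in place of integrals:

```latex
% Sketch of one standard proof of the Chebyshev inequality.
% Continuous case shown; for a discrete X, replace each integral
% by a sum over the support of X.
\begin{align*}
\sigma^2 &= E\!\left[(\mathbf{X}-\mu)^2\right]
          = \int_{-\infty}^{\infty} (x-\mu)^2 f_X(x)\,dx \\
         &\ge \int_{\{x:\,|x-\mu|\ge\varepsilon\}} (x-\mu)^2 f_X(x)\,dx \\
         &\ge \varepsilon^2 \int_{\{x:\,|x-\mu|\ge\varepsilon\}} f_X(x)\,dx
          = \varepsilon^2\, P\!\left(|\mathbf{X}-\mu| \ge \varepsilon\right),
\end{align*}
% Dividing through by \varepsilon^2 gives, for every \varepsilon > 0,
\[
P\!\left(|\mathbf{X}-\mu| \ge \varepsilon\right) \le \frac{\sigma^2}{\varepsilon^2}.
\]
```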


Click here to view student answers and discussions

Part 2. 25 pts


Let $ \mathbf{X}_{1}, \dots, \mathbf{X}_{n}, \dots $ be a sequence of independent, identically distributed random variables, each uniformly distributed on the interval [0, 1], and hence having pdf
$ f_{X}\left(x\right)=\begin{cases} 1, & \text{for } 0 \leq x \leq 1,\\ 0, & \text{elsewhere.} \end{cases} $

Let $ \mathbf{Y}_{n} $ be a new random variable defined by

$ \mathbf{Y}_{n} = \min\,\{ \mathbf{X}_1, \mathbf{X}_2, \dots, \mathbf{X}_n \} $


(a) Find the pdf of $ \mathbf{Y}_{n} $.

(b) Does the sequence $ \mathbf{Y}_{n} $ converge in probability?

(c) Does the sequence $ \mathbf{Y}_{n} $ converge in distribution? If yes, specify the cumulative distribution function of the random variable it converges to.
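As a numerical sanity check on parts (b) and (c) (a sketch, not part of the exam): since $ P(\mathbf{Y}_n > \varepsilon) = (1-\varepsilon)^n \to 0 $ for any $ \varepsilon \in (0,1) $, the sequence converges in probability (and hence in distribution) to the constant 0. The Monte Carlo estimate below, with an assumed helper name `p_min_exceeds`, illustrates this decay:

```python
# Monte Carlo check that Y_n = min(X_1, ..., X_n), with X_i iid
# Uniform[0, 1], satisfies P(Y_n > eps) = (1 - eps)^n -> 0.
import random

def p_min_exceeds(n, eps, trials=20000, seed=0):
    """Estimate P(min(X_1, ..., X_n) > eps) by simulation."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(trials):
        y = min(rng.random() for _ in range(n))
        if y > eps:
            hits += 1
    return hits / trials

if __name__ == "__main__":
    for n in (1, 5, 25, 100):
        est = p_min_exceeds(n, 0.05)
        exact = (1 - 0.05) ** n
        print(f"n={n:3d}  estimate={est:.4f}  exact={exact:.4f}")
```

The estimates track $ (1-\varepsilon)^n $ closely and shrink toward 0 as $ n $ grows, consistent with $ \mathbf{Y}_n \to 0 $ in probability.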


Click here to view student answers and discussions

Back to ECE Qualifying Exams (QE) page
