ECE Ph.D. Qualifying Exam

Communication, Networking, Signal and Image Processing (CS)

Question 1: Probability and Random Processes

January 2002
 
----

==Question==

'''1. (20 pts)'''

Given two coins: the first coin is fair and the second coin has two heads. One coin is picked at random and tossed two times. It shows heads both times. What is the probability that the coin picked is fair?
:'''Click [[ECE_PhD_QE_CNSIP_Jan_2002_Problem1.1|here]] to view student [[ECE_PhD_QE_CNSIP_Jan_2002_Problem1.1|answers and discussions]]'''
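:''One standard way to start (a sketch only): treat the two equally likely coin choices as the hypotheses and apply Bayes' rule,''
::<math>P\left(\text{fair}\mid HH\right)=\frac{P\left(HH\mid\text{fair}\right)P\left(\text{fair}\right)}{P\left(HH\mid\text{fair}\right)P\left(\text{fair}\right)+P\left(HH\mid\text{two-headed}\right)P\left(\text{two-headed}\right)}=\frac{\frac{1}{4}\cdot\frac{1}{2}}{\frac{1}{4}\cdot\frac{1}{2}+1\cdot\frac{1}{2}}=\frac{1}{5}.</math>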
 
----

'''2. (20 pts)'''
  
 
Let <math class="inline">\mathbf{X}_{t}</math> and <math class="inline">\mathbf{Y}_{t}</math> be jointly wide sense stationary continuous parameter random processes with <math class="inline">E\left[\left|\mathbf{X}\left(0\right)-\mathbf{Y}\left(0\right)\right|^{2}\right]=0</math>. Show that <math class="inline">R_{\mathbf{X}}\left(\tau\right)=R_{\mathbf{Y}}\left(\tau\right)=R_{\mathbf{XY}}\left(\tau\right)</math>.
:'''Click [[ECE_PhD_QE_CNSIP_Jan_2002_Problem1.2|here]] to view student [[ECE_PhD_QE_CNSIP_Jan_2002_Problem1.2|answers and discussions]]'''
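:''A sketch of one standard argument, assuming the convention <math class="inline">R_{\mathbf{XY}}\left(\tau\right)=E\left[\mathbf{X}\left(t+\tau\right)\mathbf{Y}^{*}\left(t\right)\right]</math>: expanding <math class="inline">E\left[\left|\mathbf{X}\left(t\right)-\mathbf{Y}\left(t\right)\right|^{2}\right]</math> leaves only lag-zero values of the correlation functions, so joint wide sense stationarity makes it independent of <math class="inline">t</math> and hence zero for every <math class="inline">t</math>. Then, by the Cauchy-Schwarz inequality,''
::<math>\left|R_{\mathbf{X}}\left(\tau\right)-R_{\mathbf{XY}}\left(\tau\right)\right|=\left|E\left[\mathbf{X}\left(t+\tau\right)\left(\mathbf{X}\left(t\right)-\mathbf{Y}\left(t\right)\right)^{*}\right]\right|\leq\sqrt{E\left[\left|\mathbf{X}\left(t+\tau\right)\right|^{2}\right]}\sqrt{E\left[\left|\mathbf{X}\left(t\right)-\mathbf{Y}\left(t\right)\right|^{2}\right]}=0,</math>
:''and the same bound applied to <math class="inline">E\left[\left(\mathbf{X}\left(t+\tau\right)-\mathbf{Y}\left(t+\tau\right)\right)\mathbf{Y}^{*}\left(t\right)\right]</math> gives <math class="inline">R_{\mathbf{XY}}\left(\tau\right)=R_{\mathbf{Y}}\left(\tau\right)</math>.''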
 
----

'''3. (20 pts)'''
  
 
Let <math class="inline">\mathbf{X}_{t}</math> be a zero mean continuous parameter random process. Let <math class="inline">g(t)</math> and <math class="inline">w\left(t\right)</math> be measurable functions defined on the real numbers. Further, let <math class="inline">w\left(t\right)</math> be even. Let the autocorrelation function of <math class="inline">\mathbf{X}_{t}</math> be <math class="inline">\frac{g\left(t_{1}\right)g\left(t_{2}\right)}{w\left(t_{1}-t_{2}\right)}</math>. Form the new random process <math class="inline">\mathbf{Y}_{t}=\frac{\mathbf{X}\left(t\right)}{g\left(t\right)}</math>. Is <math class="inline">\mathbf{Y}_{t}</math> w.s.s.?
:'''Click [[ECE_PhD_QE_CNSIP_Jan_2002_Problem1.3|here]] to view student [[ECE_PhD_QE_CNSIP_Jan_2002_Problem1.3|answers and discussions]]'''
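:''A sketch of the usual computation, assuming <math class="inline">g</math> and <math class="inline">w</math> are real valued and <math class="inline">g\left(t\right)\neq 0</math> for all <math class="inline">t</math>:''
::<math>E\left[\mathbf{Y}\left(t\right)\right]=\frac{E\left[\mathbf{X}\left(t\right)\right]}{g\left(t\right)}=0,\qquad R_{\mathbf{Y}}\left(t_{1},t_{2}\right)=E\left[\frac{\mathbf{X}\left(t_{1}\right)}{g\left(t_{1}\right)}\cdot\frac{\mathbf{X}\left(t_{2}\right)}{g\left(t_{2}\right)}\right]=\frac{R_{\mathbf{X}}\left(t_{1},t_{2}\right)}{g\left(t_{1}\right)g\left(t_{2}\right)}=\frac{1}{w\left(t_{1}-t_{2}\right)},</math>
:''so the mean is constant and the autocorrelation depends on <math class="inline">t_{1}</math> and <math class="inline">t_{2}</math> only through <math class="inline">t_{1}-t_{2}</math>.''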
 
----

'''4. (20 pts)'''
  
 
Let <math class="inline">\mathbf{X}_{1},\mathbf{X}_{2},\cdots,\mathbf{X}_{n}</math> be i.i.d. random variables with absolutely continuous probability distribution function <math class="inline">F\left(x\right)</math>. Let the random variable <math class="inline">\mathbf{Y}_{j}</math> be the <math class="inline">j</math>-th order statistic of the <math class="inline">\mathbf{X}_{i}</math>'s; that is, <math class="inline">\mathbf{Y}_{j}=j\text{-th smallest of }\left\{ \mathbf{X}_{1},\mathbf{X}_{2},\cdots,\mathbf{X}_{n}\right\}</math>.

(a) What is another name for the first order statistic?

(b) What is another name for the n/2 order statistic?

(c) Find the probability density function of the first order statistic. (You may assume n is odd.)

:'''Click [[ECE_PhD_QE_CNSIP_Jan_2002_Problem1.4|here]] to view student [[ECE_PhD_QE_CNSIP_Jan_2002_Problem1.4|answers and discussions]]'''
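:''For part (c), a sketch of one standard route through the distribution function of the minimum, using independence of the <math class="inline">\mathbf{X}_{i}</math> and writing <math class="inline">f=F'</math> for the density:''
::<math>F_{\mathbf{Y}_{1}}\left(y\right)=P\left(\min_{i}\mathbf{X}_{i}\leq y\right)=1-P\left(\mathbf{X}_{1}>y,\ldots,\mathbf{X}_{n}>y\right)=1-\left(1-F\left(y\right)\right)^{n},</math>
::<math>f_{\mathbf{Y}_{1}}\left(y\right)=\frac{d}{dy}F_{\mathbf{Y}_{1}}\left(y\right)=n\left(1-F\left(y\right)\right)^{n-1}f\left(y\right).</math>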
  
 
----

'''5. (20 pts)'''
  
 
Let <math class="inline">\mathbf{X}</math> be a random variable with an absolutely continuous probability distribution function. Show that for any <math class="inline">\alpha>0</math> and any real number <math class="inline">s</math>: <math class="inline">P\left(e^{s\mathbf{X}}\geq\alpha\right)\leq\frac{\phi\left(s\right)}{\alpha}</math>, where <math class="inline">\phi\left(s\right)</math> is the moment generating function, <math class="inline">\phi\left(s\right)=E\left[e^{s\mathbf{X}}\right]</math>. Note: <math class="inline">\phi\left(s\right)</math> can be related to the Laplace transform of <math class="inline">f_{\mathbf{X}}\left(x\right)</math>.
  
:'''Click [[ECE_PhD_QE_CNSIP_Jan_2002_Problem1.5|here]] to view student [[ECE_PhD_QE_CNSIP_Jan_2002_Problem1.5|answers and discussions]]'''
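:''A sketch of one standard route: apply Markov's inequality to the nonnegative random variable <math class="inline">e^{s\mathbf{X}}</math>. For any <math class="inline">\alpha>0</math>,''
::<math>\phi\left(s\right)=E\left[e^{s\mathbf{X}}\right]\geq E\left[e^{s\mathbf{X}}\mathbf{1}_{\left\{ e^{s\mathbf{X}}\geq\alpha\right\} }\right]\geq\alpha\, P\left(e^{s\mathbf{X}}\geq\alpha\right).</math>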
 
----

[[ECE_PhD_Qualifying_Exams|Back to ECE Qualifying Exams (QE) page]]
