'''ECE Ph.D. Qualifying Exam'''

'''Communication, Networking, Signal and Image Processing (CS)'''

'''Question 1: Probability and Random Processes'''

'''August 2003'''

----
==Question==
 
  
'''1. (15% of Total)'''
  
This question is a set of short-answer questions (no proofs):
  
'''(a) (5%)'''
  
State the definition of a Probability Space.
  
'''(b) (5%)'''
  
State the definition of a random variable; use notation from your answer in part (a).
  
'''(c) (5%)'''
  
State the Strong Law of Large Numbers.
 
:'''Click [[ECE_PhD_QE_CNSIP_2003_Problem1.1|here]] to view student [[ECE_PhD_QE_CNSIP_2003_Problem1.1|answers and discussions]]'''
 
 
----
 
'''2. (15% of Total)'''
  
You want to simulate outcomes for an exponential random variable <math class="inline">\mathbf{X}</math> with mean <math class="inline">1/\lambda</math> . You have a random number generator that produces outcomes for a random variable <math class="inline">\mathbf{Y}</math> that is uniformly distributed on the interval <math class="inline">\left(0,1\right)</math> . What transformation applied to <math class="inline">\mathbf{Y}</math>  will yield the desired distribution for <math class="inline">\mathbf{X}</math> ? Prove your answer.
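
''Note (not part of the original exam question): the sketch below is a minimal Python/NumPy check of one standard candidate transformation, the inverse-transform map <math class="inline">\mathbf{X}=-\ln\left(1-\mathbf{Y}\right)/\lambda</math>; it only compares the empirical mean of the transformed samples with <math class="inline">1/\lambda</math> and is not a substitute for the requested proof.''

<pre>
import numpy as np

# Numerical sanity check (not part of the exam): transform Uniform(0,1)
# samples with the inverse-transform candidate X = -ln(1 - Y)/lambda and
# compare the empirical mean with the target mean 1/lambda.
rng = np.random.default_rng(0)
lam = 2.0                                # illustrative rate (assumed)
y = rng.uniform(0.0, 1.0, size=100_000)  # outcomes of Y
x = -np.log1p(-y) / lam                  # candidate outcomes of X

print("empirical mean :", x.mean())      # should be close to 1/lam
print("target mean    :", 1.0 / lam)
</pre>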
  
:'''Click [[ECE_PhD_QE_CNSIP_2003_Problem1.2|here]] to view student [[ECE_PhD_QE_CNSIP_2003_Problem1.2|answers and discussions]]'''
----
'''3. (20% of Total)'''
  
Consider three independent random variables, <math class="inline">\mathbf{X}</math> , <math class="inline">\mathbf{Y}</math> , and <math class="inline">\mathbf{Z}</math> . Assume that each one is uniformly distributed over the interval <math class="inline">\left(0,1\right)</math> . Call “Bin #1” the interval <math class="inline">\left(0,\mathbf{X}\right)</math> , and “Bin #2” the interval <math class="inline">\left(\mathbf{X},1\right)</math> .
  
'''a. (10%)'''
  
Find the probability that <math class="inline">\mathbf{Y}</math>  falls into Bin #1 (that is, <math class="inline">\mathbf{Y}<\mathbf{X}</math> ). Show your work.
  
'''b. (10%)'''
 
Find the probability that both <math class="inline">\mathbf{Y}</math> and <math class="inline">\mathbf{Z}</math>  fall into Bin #1. Show your work.
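
''Note (not part of the original exam question): the following Monte Carlo sketch (Python/NumPy, with an arbitrary sample size) estimates the probabilities asked for in parts a and b directly from simulation, so it can be used to sanity-check analytical answers.''

<pre>
import numpy as np

# Monte Carlo sanity check for parts a and b (not part of the exam):
# X, Y, Z are independent Uniform(0,1); Bin #1 is the interval (0, X).
rng = np.random.default_rng(0)
n = 1_000_000
x, y, z = rng.uniform(size=(3, n))

p_a = np.mean(y < x)                     # estimate of P(Y in Bin #1)
p_b = np.mean((y < x) & (z < x))         # estimate of P(Y and Z in Bin #1)

print("part a estimate:", p_a)           # compare with your answer to a
print("part b estimate:", p_b)           # compare with your answer to b
</pre>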
:'''Click [[ECE_PhD_QE_CNSIP_2003_Problem1.3|here]] to view student [[ECE_PhD_QE_CNSIP_2003_Problem1.3|answers and discussions]]'''
  
 
 
----
 
'''4. (25% of Total)'''
  
Let <math class="inline">\mathbf{X}_{n},\; n=1,2,\cdots</math> , be a zero mean, discrete-time, white noise process with <math class="inline">E\left(\mathbf{X}_{n}^{2}\right)=1</math>  for all <math class="inline">n</math> . Let <math class="inline">\mathbf{Y}_{0}</math>  be a random variable that is independent of the sequence <math class="inline">\left\{ \mathbf{X}_{n}\right\}</math>  , has mean <math class="inline">0</math> , and has variance <math class="inline">\sigma^{2}</math> . Define <math class="inline">\mathbf{Y}_{n},\; n=1,2,\cdots</math> , to be an autoregressive process as follows: <math class="inline">\mathbf{Y}_{n}=\frac{1}{3}\mathbf{Y}_{n-1}+\mathbf{X}_{n}.</math>
  
'''a. (20%)'''
  
Show that <math class="inline">\mathbf{Y}_{n}</math>  is asymptotically wide sense stationary and find its steady state mean and autocorrelation function.
  
'''b. (5%)'''
  
For what choice of <math class="inline">\sigma^{2}</math>  is the process wide sense stationary; i.e., not just asymptotically wide sense stationary?
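
''Note (not part of the original exam question): the sketch below simulates the recursion <math class="inline">\mathbf{Y}_{n}=\frac{1}{3}\mathbf{Y}_{n-1}+\mathbf{X}_{n}</math> with Gaussian white noise (an assumed choice; the exam only requires zero mean and unit variance) and prints the sample variance of <math class="inline">\mathbf{Y}_{n}</math>, which can be compared with the steady-state variance implied by part a and used to guess the special <math class="inline">\sigma^{2}</math> of part b.''

<pre>
import numpy as np

# Simulation sketch (not part of the exam): run many independent copies of
# Y_n = (1/3) Y_{n-1} + X_n with X_n ~ N(0, 1) (an assumed white-noise model)
# and watch the sample variance of Y_n settle toward its steady-state value.
rng = np.random.default_rng(0)
trials, steps = 50_000, 30
sigma2 = 4.0                             # variance of Y_0 (arbitrary choice)

y = rng.normal(0.0, np.sqrt(sigma2), size=trials)    # samples of Y_0
for n in range(1, steps + 1):
    y = y / 3.0 + rng.normal(size=trials)            # one recursion step
    if n % 10 == 0:
        print(f"n = {n:2d}, sample Var(Y_n) = {y.var():.4f}")

# The printed variances should approach the steady-state variance from
# part a; re-running with sigma2 set to that value is a natural experiment
# for part b.
</pre>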
  
:'''Click [[ECE_PhD_QE_CNSIP_2003_Problem1.4|here]] to view student [[ECE_PhD_QE_CNSIP_2003_Problem1.4|answers and discussions]]'''
----
'''5. (25% of Total)'''
  
Suppose that “sensor nodes” are spread around the ground (two-dimensional space) according to a Poisson Process, with an average density of nodes per unit area of <math class="inline">\lambda</math> . We are interested in the number and location of nodes inside a circle <math class="inline">C</math>  of radius one that is centered at the origin. You must quote, but do not have to prove, properties of the Poisson process that you use in your solutions to the following questions:
 
'''a. (10%)'''
 
Given that a node is in the circle <math class="inline">C</math>, determine the density or distribution function of its distance <math class="inline">\mathbf{D}</math> from the origin.
 
'''b. (15%)'''
 
Find the density or distribution of the distance from the center of <math class="inline">C</math>  to the node inside <math class="inline">C</math>  that is closest to the origin.
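
''Note (not part of the original exam question): the Monte Carlo sketch below realizes the planar Poisson process on a square containing <math class="inline">C</math> (with an illustrative value of <math class="inline">\lambda</math>) and empirically estimates the two distance distributions, so the analytical answers to parts a and b can be checked against it.''

<pre>
import numpy as np

# Monte Carlo sketch for parts a and b (not part of the exam).
# Realize a 2-D Poisson process of density lam on the square [-1, 1]^2,
# which contains the unit circle C centered at the origin.
rng = np.random.default_rng(0)
lam = 5.0                                # illustrative node density (assumed)
reps = 20_000

all_d, nearest_d = [], []
for _ in range(reps):
    n = rng.poisson(lam * 4.0)           # mean count = lam * area of square
    pts = rng.uniform(-1.0, 1.0, size=(n, 2))
    d = np.hypot(pts[:, 0], pts[:, 1])   # distances from the origin
    d = d[d <= 1.0]                      # keep only nodes inside C
    all_d.extend(d)                      # pooled distances, for part a
    if d.size > 0:
        nearest_d.append(d.min())        # closest node to the origin, part b

all_d, nearest_d = np.asarray(all_d), np.asarray(nearest_d)
print("P(D <= 1/2 | node in C)    ~", np.mean(all_d <= 0.5))
print("P(closest node within 1/2) ~", np.mean(nearest_d <= 0.5))
# Compare these empirical values with the distribution functions you derive.
</pre>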
:'''Click [[ECE_PhD_QE_CNSIP_2003_Problem1.5|here]] to view student [[ECE_PhD_QE_CNSIP_2003_Problem1.5|answers and discussions]]'''
 
----
 
 
[[ECE_PhD_Qualifying_Exams|Back to ECE Qualifying Exams (QE) page]]
 
