
7.2 QE 2001 January

1. (20 pts)

State and prove the Tchebycheff Inequality.

Answer

See the proof of the Chebyshev Inequality.
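For reference, a standard statement and proof sketch (restating the linked material; here $ \mathbf{X} $ is a random variable with mean $ \mu $ and finite variance $ \sigma^{2} $, and $ \epsilon>0 $):

$ P\left(\left|\mathbf{X}-\mu\right|\geq\epsilon\right)\leq\frac{\sigma^{2}}{\epsilon^{2}}. $

$ \sigma^{2}=E\left[\left(\mathbf{X}-\mu\right)^{2}\right]\geq E\left[\left(\mathbf{X}-\mu\right)^{2}\mathbf{1}_{\left\{ \left|\mathbf{X}-\mu\right|\geq\epsilon\right\} }\right]\geq\epsilon^{2}E\left[\mathbf{1}_{\left\{ \left|\mathbf{X}-\mu\right|\geq\epsilon\right\} }\right]=\epsilon^{2}P\left(\left|\mathbf{X}-\mu\right|\geq\epsilon\right). $

Dividing both sides by $ \epsilon^{2} $ gives the inequality.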

2.

(a) (7 pts)

Let $ A $ and $ B $ be statistically independent events in the same probability space. Are $ A $ and $ B^{C} $ independent? (You must prove your result).

$ P\left(A\right)=P\left(A\cap\left(B\cup B^{C}\right)\right)=P\left(\left(A\cap B\right)\cup\left(A\cap B^{C}\right)\right)=P\left(A\cap B\right)+P\left(A\cap B^{C}\right)=P\left(A\right)P\left(B\right)+P\left(A\cap B^{C}\right). $

The third equality holds because $ A\cap B $ and $ A\cap B^{C} $ are disjoint, and the last uses the independence of $ A $ and $ B $.

$ P\left(A\cap B^{C}\right)=P\left(A\right)-P\left(A\right)P\left(B\right)=P\left(A\right)\left(1-P\left(B\right)\right)=P\left(A\right)P\left(B^{C}\right). $

$ \therefore A\text{ and }B^{C}\text{ are independent. } $

(b) (7 pts)

Can two events be statistically independent and mutually exclusive? (You must derive the conditions on A and B for this to be true or not.)

If $ P\left(A\right)=0 $ or $ P\left(B\right)=0 $ , then A and B are statistically independent and mutually exclusive. Prove this:

Without loss of generality, suppose that $ P\left(A\right)=0 $ . $ 0=P\left(A\right)\geq P\left(A\cap B\right)\geq0\Longrightarrow P\left(A\cap B\right)=0\qquad\therefore\text{mutually exclusive}. $

$ P\left(A\cap B\right)=0=P\left(A\right)P\left(B\right)\qquad\therefore\text{statistically independent.} $
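Conversely (a short step the solution above leaves implicit): if $ A $ and $ B $ are both statistically independent and mutually exclusive, then $ 0=P\left(A\cap B\right)=P\left(A\right)P\left(B\right) $, so $ P\left(A\right)=0 $ or $ P\left(B\right)=0 $. Hence this is exactly the condition under which both properties can hold.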

(c) (6 pts)

State the Axioms of Probability.

Answer

See the Axioms of Probability.
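For reference, a standard statement (restating the linked material): a probability measure $ P $ on a sample space $ S $ satisfies

1. $ P\left(A\right)\geq0 $ for every event $ A $;

2. $ P\left(S\right)=1 $;

3. for any countable collection of mutually exclusive events $ A_{1},A_{2},\ldots $ , $ P\left(\bigcup_{i=1}^{\infty}A_{i}\right)=\sum_{i=1}^{\infty}P\left(A_{i}\right) $.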

3. (20 pts)

Let $ \mathbf{X}_{1},\mathbf{X}_{2},\cdots $ be a sequence of random variables that converges in mean square to the random variable $ \mathbf{X} $ . Does the sequence also converge to $ \mathbf{X} $ in probability? (A simple yes or no answer is not acceptable; you must derive the result.)

We know that $ E\left[\left|\mathbf{X}-\mathbf{X}_{n}\right|^{2}\right]\rightarrow0 $ as $ n\rightarrow\infty $ .

By using Chebyshev Inequality,

$ \lim_{n\rightarrow\infty}P\left(\left|\mathbf{X}-\mathbf{X}_{n}\right|\geq\epsilon\right)\leq\lim_{n\rightarrow\infty}\left(\frac{E\left[\left|\mathbf{X}-\mathbf{X}_{n}\right|^{2}\right]}{\epsilon^{2}}\right)=\frac{\lim_{n\rightarrow\infty}E\left[\left|\mathbf{X}-\mathbf{X}_{n}\right|^{2}\right]}{\epsilon^{2}}=0. $

$ \therefore $ A sequence of random variables that converges in the mean-square sense to the random variable $ \mathbf{X} $ also converges to $ \mathbf{X} $ in probability.

4. (20 pts)

Let $ \mathbf{X}_{t} $ be a band-limited white-noise, strictly stationary random process with bandwidth 10 kHz. It is also known that $ \mathbf{X}_{t} $ is uniformly distributed between $ \pm5 $ volts. Find:

(a) (10 pts)

Let $ \mathbf{Y}_{t}=\left(\mathbf{X}_{t}\right)^{2} $ . Find the mean square value of $ \mathbf{Y}_{t} $ .
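A sketch of the computation (assuming, per the problem statement, that the first-order density of $ \mathbf{X}_{t} $ is uniform on $ \left[-5,5\right] $ , i.e. $ f_{\mathbf{X}}\left(x\right)=1/10 $ there and zero elsewhere):

$ E\left[\mathbf{Y}_{t}^{2}\right]=E\left[\mathbf{X}_{t}^{4}\right]=\int_{-5}^{5}x^{4}\cdot\frac{1}{10}\, dx=\frac{1}{10}\cdot\frac{2\cdot5^{5}}{5}=125. $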

(b) (10 pts)

Let $ \mathbf{X}_{t} $ be the input to a linear shift-invariant system with transfer function:
$ H\left(f\right)=\begin{cases} 1 & \text{for }\left|f\right|\leq5\text{ kHz}\\ 0.5 & \text{for }5\text{ kHz}<\left|f\right|\leq50\text{ kHz}\\ 0 & \text{elsewhere.} \end{cases} $

Find the mean and variance of the output.
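A sketch of one standard approach (the assumptions here: "band-limited white noise" is taken to mean a flat power spectral density over $ \left|f\right|\leq10\text{ kHz} $ , with total power equal to the variance of the uniform amplitude distribution). The input mean is zero, so the output mean is $ H\left(0\right)E\left[\mathbf{X}_{t}\right]=0 $ . The input power is

$ \sigma_{\mathbf{X}}^{2}=\frac{\left(5-\left(-5\right)\right)^{2}}{12}=\frac{25}{3}, $

so $ S_{\mathbf{X}}\left(f\right)=\frac{25/3}{2\times10^{4}} $ for $ \left|f\right|\leq10\text{ kHz} $ and zero elsewhere. The output variance is then

$ \sigma_{\mathbf{Y}}^{2}=\int_{-\infty}^{\infty}\left|H\left(f\right)\right|^{2}S_{\mathbf{X}}\left(f\right)df=\frac{25/3}{2\times10^{4}}\left(1\cdot10^{4}+0.25\cdot10^{4}\right)=\frac{125}{24}\approx5.21\text{ volts}^{2}. $

5. (20 pts)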

Let a linear discrete parameter shift-invariant system have the following difference equation: $ y\left(n\right)=0.7y\left(n-1\right)+x\left(n\right) $ where $ x\left(n\right) $ is the input and $ y\left(n\right) $ is the output. Now suppose this system has as its input the discrete parameter random process $ \mathbf{X}_{n} $ . You may assume that the input process is zero-mean i.i.d.
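A fact used in the convolution sums below (assuming, as is standard, a causal system initially at rest): the impulse response of this recursion is

$ h\left(n\right)=\left(0.7\right)^{n}u\left(n\right), $

where $ u\left(n\right) $ is the unit step.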

(a) (5 pts)

Is the input wide-sense stationary (show your work)?

$ E\left[\mathbf{X}_{n}\right]=0. $

$ R_{\mathbf{XX}}\left(n+m,\; n\right)=E\left[\mathbf{X}_{n+m}\mathbf{X}_{n}\right]=\sigma_{\mathbf{X}}^{2}\delta\left(m\right) $ , since the process is zero-mean i.i.d.; this depends only on the offset $ m $ .

$ \therefore\;\mathbf{X}_{n}\text{ is wide-sense stationary.} $

(b) (5 pts)

Is the output process wide-sense stationary (show your work)?

$ E\left[\mathbf{Y}_{n}\right]=0.7E\left[\mathbf{Y}_{n-1}\right]+E\left[\mathbf{X}_{n}\right]=0.7E\left[\mathbf{Y}_{n-1}\right]=0.7^{2}E\left[\mathbf{Y}_{n-2}\right]=\cdots=0.7^{n}E\left[\mathbf{Y}_{0}\right]=0. $

$ E\left[\mathbf{Y}_{0}\right]=E\left[\sum_{n=-\infty}^{\infty}h\left(0-n\right)\mathbf{X}\left(n\right)\right]=\sum_{n=-\infty}^{\infty}h\left(-n\right)E\left[\mathbf{X}\left(n\right)\right]=0. $

$ R_{\mathbf{YY}}\left(n+m,\; n\right)=E\left[\mathbf{Y}_{n+m}\mathbf{Y}_{n}\right]=\frac{\sigma_{\mathbf{X}}^{2}}{1-\left(0.7\right)^{2}}\left(0.7\right)^{\left|m\right|}\qquad\text{(derived in part (d) below).} $

$ R_{\mathbf{YY}}\left(n+m,\; n\right) $ depends only on the time difference $ m $ , and the mean is constant. Thus, $ \mathbf{Y}_{n} $ is wide-sense stationary.

(c) (5 pts)

Find the autocorrelation function of the input process.

$ R_{\mathbf{XX}}\left(n,n+m\right)=R_{\mathbf{X}}\left(m\right)=\sigma_{\mathbf{X}}^{2}\delta\left(m\right). $

(d) (5 pts)

Find the autocorrelation function, in closed form, for the output process.

$ R_{\mathbf{Y}}\left(m\right)=E\left[\mathbf{Y}\left(n+m\right)\mathbf{Y}\left(n\right)\right]=E\left[\sum_{k=-\infty}^{\infty}h\left(n+m-k\right)\mathbf{X}\left(k\right)\mathbf{Y}\left(n\right)\right]=\sum_{k=-\infty}^{\infty}h\left(n+m-k\right)E\left[\mathbf{X}\left(k\right)\mathbf{Y}\left(n\right)\right]. $

$ \because\; E\left[\mathbf{X}\left(n\right)\mathbf{Y}\left(m\right)\right]=E\left[\sum_{k=-\infty}^{\infty}h\left(m-k\right)\mathbf{X}\left(n\right)\mathbf{X}\left(k\right)\right]=\sum_{k=-\infty}^{\infty}h\left(m-k\right)\left(\sigma_{\mathbf{X}}^{2}\delta\left(n-k\right)\right)=\sigma_{\mathbf{X}}^{2}h\left(m-n\right). $
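Substituting $ E\left[\mathbf{X}\left(k\right)\mathbf{Y}\left(n\right)\right]=\sigma_{\mathbf{X}}^{2}h\left(n-k\right) $ and $ h\left(n\right)=\left(0.7\right)^{n}u\left(n\right) $ gives (a sketch completing the step; the case $ m<0 $ follows from the symmetry $ R_{\mathbf{Y}}\left(-m\right)=R_{\mathbf{Y}}\left(m\right) $):

$ R_{\mathbf{Y}}\left(m\right)=\sigma_{\mathbf{X}}^{2}\sum_{k=-\infty}^{\infty}h\left(n+m-k\right)h\left(n-k\right)=\sigma_{\mathbf{X}}^{2}\sum_{j=0}^{\infty}h\left(j+m\right)h\left(j\right)=\sigma_{\mathbf{X}}^{2}\left(0.7\right)^{m}\sum_{j=0}^{\infty}\left(0.49\right)^{j}=\frac{\sigma_{\mathbf{X}}^{2}}{0.51}\left(0.7\right)^{m},\qquad m\geq0. $

In closed form, $ R_{\mathbf{Y}}\left(m\right)=\frac{\sigma_{\mathbf{X}}^{2}}{0.51}\left(0.7\right)^{\left|m\right|}. $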


Back to ECE600

Back to my ECE 600 QE page

Back to the general ECE PHD QE page (for problem discussion)
