Revision as of 11:34, 21 November 2010
6.1 MRB 1992 Final
1. (15 pts.)
Let $ \mathbf{X}_{1},\cdots,\mathbf{X}_{n},\cdots $ be a sequence of independent, identically distributed random variables, each having pdf
$ f_{\mathbf{X}}\left(x\right)=\frac{1}{\mu}\exp\left\{ -\frac{x}{\mu}\right\} \cdot\mathbf{1}_{\left[0,\infty\right)}\left(x\right). $ Let $ \mathbf{Y}_{n} $ be a new random variable defined by
$ \mathbf{Y}_{n}=\min\left\{ \mathbf{X}_{1},\cdots,\mathbf{X}_{n}\right\} $ .
Note that $ \mathbf{X} $ is an exponentially distributed random variable with mean $ \mu $.
(a)
Find the pdf of $ \mathbf{Y}_{n} $ .
• The cdf of $ \mathbf{Y}_{n} $ is
$ F_{\mathbf{Y}_{n}}\left(y\right)=P\left(\left\{ \mathbf{Y}_{n}\leq y\right\} \right)=1-P\left(\left\{ \mathbf{Y}_{n}>y\right\} \right) $$ =1-P\left(\left\{ \mathbf{X}_{1}>y\right\} \cap\left\{ \mathbf{X}_{2}>y\right\} \cap\cdots\cap\left\{ \mathbf{X}_{n}>y\right\} \right) $$ =1-\prod_{k=1}^{n}\left(1-F_{\mathbf{X}_{k}}\left(y\right)\right)=1-\left(1-F_{\mathbf{X}}\left(y\right)\right)^{n} $$ =\left[1-\exp\left\{ -y/\left(\mu/n\right)\right\} \right]\cdot\mathbf{1}_{\left[0,\infty\right)}\left(y\right). $
• The pdf of $ \mathbf{Y}_{n} $ that is derived from the cdf of $ \mathbf{Y}_{n} $ is $ f_{\mathbf{Y}_{n}}\left(y\right)=\frac{dF_{\mathbf{Y}_{n}}\left(y\right)}{dy}=\frac{1}{\left(\mu/n\right)}\exp\left\{ \frac{-y}{\left(\mu/n\right)}\right\} \cdot\mathbf{1}_{\left[0,\infty\right)}\left(y\right). $
• $ \mathbf{Y}_{n} $ is exponentially distributed with mean $ \mu/n $ .
• $ E\left[\mathbf{Y}_{n}\right]=\mu/n $ , $ E\left[\mathbf{Y}_{n}^{2}\right]=\int_{0}^{\infty}\frac{y^{2}}{\left(\mu/n\right)}\exp\left\{ \frac{-y}{\left(\mu/n\right)}\right\} dy=\frac{2\mu^{2}}{n^{2}} $ , and $ \sigma_{\mathbf{Y}_{n}}^{2}=E\left[\mathbf{Y}_{n}^{2}\right]-\left(E\left[\mathbf{Y}_{n}\right]\right)^{2}=\frac{2\mu^{2}}{n^{2}}-\frac{\mu^{2}}{n^{2}}=\frac{\mu^{2}}{n^{2}}. $
– In fact, we directly know that $ \sigma_{\mathbf{Y}_{n}}^{2}=\frac{\mu^{2}}{n^{2}} $ because we know that $ \mathbf{Y}_{n} $ is exponentially distributed.
• Thus as $ n\rightarrow\infty $ , $ E\left[\mathbf{Y}_{n}\right]\rightarrow0 $ and $ \sigma_{\mathbf{Y}_{n}}^{2}\rightarrow0 $ . So we suspect that $ \mathbf{Y}_{n}\rightarrow0 $ . Let's check whether this is true in probability $ \left(p\right) $ and in the mean-square sense $ \left(m.s.\right) $ .
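The distributional claim above is easy to sanity-check by simulation. The sketch below (plain numpy, not part of the original exam solution; the parameter values are chosen arbitrarily) draws minima of $ n $ i.i.d. exponentials and compares the empirical mean and variance against $ \mu/n $ and $ \mu^{2}/n^{2} $.

```python
import numpy as np

rng = np.random.default_rng(0)
mu, n, trials = 2.0, 10, 200_000

# Each row holds n i.i.d. Exp(mu) draws; Y_n is the row-wise minimum.
X = rng.exponential(scale=mu, size=(trials, n))
Y = X.min(axis=1)

# Theory: Y_n ~ Exp(mu/n), so E[Y_n] = mu/n and Var(Y_n) = (mu/n)**2.
print(Y.mean())  # should be close to mu/n = 0.2
print(Y.var())   # should be close to (mu/n)**2 = 0.04
```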
(b)
Does the sequence of random variables $ \left\{ \mathbf{Y}_{n}\right\} $ converge in probability? Justify your answer.
• If $ \mathbf{Y}_{n}\rightarrow\left(p\right)\rightarrow0 $ , then for any $ \epsilon>0 $ , $ P\left(\left\{ \left|\mathbf{Y}_{n}-0\right|>\epsilon\right\} \right)\rightarrow0 $ as $ n\rightarrow\infty $ .
• This is true if for each $ \epsilon>0 $ and $ \delta>0 $ , there exists an $ n_{0} $ such that $ P\left(\left\{ \left|\mathbf{Y}_{n}-0\right|>\epsilon\right\} \right)<\delta $ for all $ n>n_{0} $ .
• Note that $ P\left(\left\{ \left|\mathbf{Y}_{n}-0\right|>\epsilon\right\} \right)=P\left(\left\{ \mathbf{Y}_{n}>\epsilon\right\} \right)=e^{-\frac{n\epsilon}{\mu}} $ .
• If we take $ n_{0}=\left\lceil \frac{\mu}{\epsilon}\ln\left(\frac{1}{\delta}\right)\right\rceil $ , then $ e^{-\frac{n\epsilon}{\mu}}<\delta $ for all $ n>n_{0} $ .
• Thus $ \mathbf{Y}_{n}\rightarrow\left(p\right)\rightarrow0 $ .
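As a numerical illustration (a numpy sketch with arbitrarily chosen $ \mu $, $ \epsilon $, $ \delta $; not part of the original solution), the tail probability $ P\left(\mathbf{Y}_{n}>\epsilon\right)=e^{-n\epsilon/\mu} $ can be compared against simulation, and the $ n_{0} $ above checked to push the tail below $ \delta $:

```python
import numpy as np
from math import ceil, exp, log

rng = np.random.default_rng(1)
mu, eps, trials = 1.0, 0.1, 100_000

for n in (5, 20, 80):
    Y = rng.exponential(scale=mu, size=(trials, n)).min(axis=1)
    # Empirical tail vs. the closed form e^{-n*eps/mu}; both shrink toward 0.
    print(n, (Y > eps).mean(), np.exp(-n * eps / mu))

# The n_0 from the solution: for all n > n_0 the tail probability is below delta.
delta = 0.01
n0 = ceil(mu / eps * log(1 / delta))
assert exp(-(n0 + 1) * eps / mu) < delta
```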
(c)
Does the sequence of random variables $ \left\{ \mathbf{Y}_{n}\right\} $ converge in the mean-square sense? Justify your answer.
• If $ \mathbf{Y}_{n}\rightarrow\left(m.s.\right)\rightarrow0 $ , then $ E\left[\left|\mathbf{Y}_{n}-0\right|^{2}\right]\rightarrow0 $ as $ n\rightarrow\infty $ .
• This is true if, for every $ \epsilon>0 $ , there exists an $ n_{0} $ such that $ E\left[\left|\mathbf{Y}_{n}-0\right|^{2}\right]<\epsilon $ for all $ n>n_{0} $ .
• Note that $ E\left[\left|\mathbf{Y}_{n}-0\right|^{2}\right]=E\left[\mathbf{Y}_{n}^{2}\right]=\frac{2\mu^{2}}{n^{2}} $ .
• If we take $ n_{0}=\left\lceil \mu\sqrt{\frac{2}{\epsilon}}\right\rceil $ , then $ \frac{2\mu^{2}}{n^{2}}<\epsilon $ for all $ n>n_{0} $ .
• Thus $ \mathbf{Y}_{n}\rightarrow\left(m.s.\right)\rightarrow0 $ .
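Again purely as a sanity check (numpy, arbitrary parameter values, not part of the original solution), the second moment $ E\left[\mathbf{Y}_{n}^{2}\right]=\frac{2\mu^{2}}{n^{2}} $ can be estimated by simulation and seen to shrink toward $ 0 $ as $ n $ grows:

```python
import numpy as np

rng = np.random.default_rng(2)
mu, trials = 1.0, 400_000

for n in (4, 16):
    Y = rng.exponential(scale=mu, size=(trials, n)).min(axis=1)
    # Empirical E[|Y_n - 0|^2] vs. the closed form 2*mu**2 / n**2.
    print(n, (Y ** 2).mean(), 2 * mu ** 2 / n ** 2)
```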
4. (14 pts.)
Let $ \mathbf{X}\left(t\right) $ be a zero-mean wide-sense stationary Gaussian white noise process with autocorrelation function $ R_{\mathbf{XX}}\left(\tau\right)=S_{0}\delta\left(\tau\right) $ . Suppose that $ \mathbf{X}\left(t\right) $ is the input to a linear time invariant system with impulse response $ h\left(t\right)=\frac{\sin\sigma\left(t-b\right)}{\sigma\left(t-b\right)},\quad-\infty<t<\infty, $ where $ b $ is a positive constant. Let $ \mathbf{Y}\left(t\right) $ be the output of the system and assume that the input has been applied to the system for all time.
(a)
What is the mean of $ \mathbf{Y}\left(t\right) $ ?
$ E\left[\mathbf{Y}\left(t\right)\right]=E\left[\int_{-\infty}^{\infty}h\left(\tau\right)\mathbf{X}\left(t-\tau\right)d\tau\right]=\int_{-\infty}^{\infty}h\left(\tau\right)E\left[\mathbf{X}\left(t-\tau\right)\right]d\tau=\int_{-\infty}^{\infty}h\left(\tau\right)\cdot0d\tau=0. $
(b)
What is the power spectral density of $ \mathbf{Y}\left(t\right) $ ?
$ S_{\mathbf{XX}}\left(\omega\right)=\int_{-\infty}^{\infty}S_{0}\delta\left(\tau\right)e^{-i\omega\tau}d\tau=S_{0}. $
$ H\left(\omega\right)=\int_{-\infty}^{\infty}h\left(t\right)e^{-i\omega t}dt=\int_{-\infty}^{\infty}\frac{\sin\sigma\left(t-b\right)}{\sigma\left(t-b\right)}e^{-i\omega t}dt=e^{-i\omega b}\int_{-\infty}^{\infty}\frac{\sin\sigma u}{\sigma u}e^{-i\omega u}du=e^{-i\omega b}\left(\frac{\pi}{\sigma}\right)\cdot1_{\left[-\sigma,\sigma\right]}\left(\omega\right), $ where we substituted $ u=t-b $ .
$ \left|H\left(\omega\right)\right|^{2}=H\left(\omega\right)H^{*}\left(\omega\right)=\left(\frac{\pi}{\sigma}\right)^{2}\cdot1_{\left[-\sigma,\sigma\right]}\left(\omega\right). $
$ S_{\mathbf{YY}}\left(\omega\right)=S_{\mathbf{XX}}\left(\omega\right)\left|H\left(\omega\right)\right|^{2}=S_{0}\cdot\left(\frac{\pi}{\sigma}\right)^{2}\cdot1_{\left[-\sigma,\sigma\right]}\left(\omega\right). $
(c)
What is the autocorrelation function of $ \mathbf{Y}\left(t\right) $ ?
$ R_{\mathbf{YY}}\left(\tau\right)=\frac{1}{2\pi}\int_{-\infty}^{\infty}S_{\mathbf{YY}}\left(\omega\right)e^{i\omega\tau}d\omega=\frac{S_{0}}{2\pi}\left(\frac{\pi}{\sigma}\right)^{2}\int_{-\sigma}^{\sigma}e^{i\omega\tau}d\omega=\frac{S_{0}\pi}{2\sigma^{2}}\cdot\frac{e^{i\tau\sigma}-e^{-i\tau\sigma}}{i\tau} $$ =\frac{S_{0}\pi}{\tau\sigma^{2}}\cdot\frac{e^{i\tau\sigma}-e^{-i\tau\sigma}}{2i}=\frac{S_{0}\pi}{\tau\sigma^{2}}\cdot\sin\tau\sigma=S_{0}\left(\frac{\pi}{\sigma}\right)\cdot\frac{\sin\sigma\tau}{\sigma\tau}. $
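To double-check the inverse transform, the sketch below (numpy, with arbitrarily chosen $ S_{0} $ and $ \sigma $; not part of the original solution) integrates $ S_{\mathbf{YY}}\left(\omega\right)e^{i\omega\tau} $ over $ \left[-\sigma,\sigma\right] $ with a trapezoid rule and compares against the closed form $ S_{0}\left(\frac{\pi}{\sigma}\right)\frac{\sin\sigma\tau}{\sigma\tau} $:

```python
import numpy as np

S0, sigma = 3.0, 2.0
w = np.linspace(-sigma, sigma, 200_001)
dw = w[1] - w[0]

def R_numeric(tau):
    # (1/2pi) * integral of S_YY(w) * e^{i w tau} over [-sigma, sigma],
    # evaluated with a simple trapezoid rule.
    f = S0 * (np.pi / sigma) ** 2 * np.exp(1j * w * tau)
    return (dw * (f[0] / 2 + f[1:-1].sum() + f[-1] / 2)).real / (2 * np.pi)

def R_closed(tau):
    # Closed form derived above: S0 * (pi/sigma) * sinc, for tau != 0.
    return S0 * (np.pi / sigma) * np.sin(sigma * tau) / (sigma * tau)

for tau in (0.3, 1.7):
    print(tau, R_numeric(tau), R_closed(tau))
```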
(d)
Write an expression for the second-order density $ f_{\mathbf{Y}\left(t_{1}\right)\mathbf{Y}\left(t_{2}\right)}\left(y_{1},y_{2}\right) $ of $ \mathbf{Y}\left(t\right) $ .
Because $ \mathbf{X}\left(t\right) $ is a WSS Gaussian random process and the system is an LTI system, $ \mathbf{Y}\left(t\right) $ is also a WSS Gaussian random process, with $ E\left[\mathbf{Y}\left(t\right)\right]=0 $ and $ \sigma_{\mathbf{Y}\left(t\right)}^{2}=R_{\mathbf{YY}}\left(0\right)=S_{0}\left(\frac{\pi}{\sigma}\right) $ .
$ r_{\mathbf{Y}\left(t_{1}\right)\mathbf{Y}\left(t_{2}\right)}=r\left(t_{1}-t_{2}\right)=\frac{R_{\mathbf{YY}}\left(t_{1}-t_{2}\right)}{\sqrt{\sigma_{\mathbf{Y}\left(t_{1}\right)}^{2}\sigma_{\mathbf{Y}\left(t_{2}\right)}^{2}}}=\frac{R_{\mathbf{YY}}\left(t_{1}-t_{2}\right)}{R_{\mathbf{YY}}\left(0\right)}=\frac{\sin\sigma\left(t_{1}-t_{2}\right)}{\sigma\left(t_{1}-t_{2}\right)}. $ $ f_{\mathbf{Y}\left(t_{1}\right)\mathbf{Y}\left(t_{2}\right)}\left(y_{1},y_{2}\right)=\frac{1}{2\pi\sigma_{\mathbf{Y}\left(t_{1}\right)}\sigma_{\mathbf{Y}\left(t_{2}\right)}\sqrt{1-r^{2}\left(t_{1}-t_{2}\right)}}\exp\left\{ \frac{-1}{2\left(1-r^{2}\left(t_{1}-t_{2}\right)\right)}\left[\frac{y_{1}^{2}}{\sigma_{\mathbf{Y}\left(t_{1}\right)}^{2}}-\frac{2r\left(t_{1}-t_{2}\right)y_{1}y_{2}}{\sigma_{\mathbf{Y}\left(t_{1}\right)}\sigma_{\mathbf{Y}\left(t_{2}\right)}}+\frac{y_{2}^{2}}{\sigma_{\mathbf{Y}\left(t_{2}\right)}^{2}}\right]\right\} $$ =\frac{1}{2\pi R_{\mathbf{YY}}\left(0\right)\sqrt{1-r^{2}\left(t_{1}-t_{2}\right)}}\exp\left\{ \frac{-1}{2R_{\mathbf{YY}}\left(0\right)\left(1-r^{2}\left(t_{1}-t_{2}\right)\right)}\left[y_{1}^{2}-2r\left(t_{1}-t_{2}\right)y_{1}y_{2}+y_{2}^{2}\right]\right\} . $
$ \therefore \left(\mathbf{Y}\left(t_{1}\right),\mathbf{Y}\left(t_{2}\right)\right) $ has distribution $ N\left[0,0,\sqrt{S_{0}\frac{\pi}{\sigma}},\sqrt{S_{0}\frac{\pi}{\sigma}},\frac{\sin\sigma\left(t_{1}-t_{2}\right)}{\sigma\left(t_{1}-t_{2}\right)}\right] $ .
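As a final consistency check (a numpy sketch with arbitrarily chosen $ S_{0} $, $ \sigma $, $ t_{1} $, $ t_{2} $; not part of the exam solution), the closed-form bivariate density above should agree with the generic zero-mean Gaussian density built from the covariance matrix of $ \left(\mathbf{Y}\left(t_{1}\right),\mathbf{Y}\left(t_{2}\right)\right) $:

```python
import numpy as np

S0, sigma = 1.0, 1.5
t1, t2 = 0.4, 1.1
tau = t1 - t2

var = S0 * np.pi / sigma                 # R_YY(0), the common variance
r = np.sin(sigma * tau) / (sigma * tau)  # correlation coefficient r(t1 - t2)

# Generic zero-mean bivariate normal density from the covariance matrix.
C = np.array([[var, r * var], [r * var, var]])
y = np.array([0.3, -0.5])
pdf_matrix = np.exp(-0.5 * y @ np.linalg.inv(C) @ y) / (2 * np.pi * np.sqrt(np.linalg.det(C)))

# Closed form derived in the solution above.
pdf_closed = np.exp(-(y[0] ** 2 - 2 * r * y[0] * y[1] + y[1] ** 2)
                    / (2 * var * (1 - r ** 2))) / (2 * np.pi * var * np.sqrt(1 - r ** 2))
print(pdf_matrix, pdf_closed)
```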