

ECE Ph.D. Qualifying Exam

Communication, Networking, Signal and Image Processing (CS)

Question 1: Probability and Random Processes

August 2003



4. (25% of Total)

Let $ \mathbf{X}_{n},\; n=1,2,\cdots $ , be a zero mean, discrete-time, white noise process with $ E\left(\mathbf{X}_{n}^{2}\right)=1 $ for all $ n $ . Let $ \mathbf{Y}_{0} $ be a random variable that is independent of the sequence $ \left\{ \mathbf{X}_{n}\right\} $ , has mean $ 0 $ , and has variance $ \sigma^{2} $ . Define $ \mathbf{Y}_{n},\; n=1,2,\cdots $ , to be an autoregressive process as follows: $ \mathbf{Y}_{n}=\frac{1}{3}\mathbf{Y}_{n-1}+\mathbf{X}_{n}. $
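As a quick illustration (not part of the original exam), here is a minimal simulation sketch of the recursion in Python/NumPy, assuming Gaussian draws for $ \mathbf{X}_{n} $ and $ \mathbf{Y}_{0} $ (the problem only constrains their first two moments):

import numpy as np

def simulate_ar1(n_steps, sigma2, a=1/3, seed=0):
    """Simulate Y_n = a*Y_{n-1} + X_n with zero-mean, unit-variance white noise X_n."""
    rng = np.random.default_rng(seed)
    y = np.empty(n_steps + 1)
    y[0] = rng.normal(scale=np.sqrt(sigma2))   # Y_0: mean 0, variance sigma^2
    x = rng.normal(size=n_steps)               # X_1, ..., X_n: mean 0, E[X_n^2] = 1
    for n in range(1, n_steps + 1):
        y[n] = a * y[n - 1] + x[n - 1]
    return y

path = simulate_ar1(50, sigma2=2.0)            # one sample path Y_0, ..., Y_50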

a. (20%)

Show that $ \mathbf{Y}_{n} $ is asymptotically wide sense stationary and find its steady state mean and autocorrelation function.

$ \mathbf{Y}_{n}=\frac{1}{3}\mathbf{Y}_{n-1}+\mathbf{X}_{n}=\frac{1}{3}\left(\frac{1}{3}\mathbf{Y}_{n-2}+\mathbf{X}_{n-1}\right)+\mathbf{X}_{n}=\left(\frac{1}{3}\right)^{2}\mathbf{Y}_{n-2}+\mathbf{X}_{n}+\frac{1}{3}\mathbf{X}_{n-1} $$ =\cdots=\left(\frac{1}{3}\right)^{n}\mathbf{Y}_{0}+\sum_{k=0}^{n-1}\left(\frac{1}{3}\right)^{k}\mathbf{X}_{n-k}. $
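A numerical sanity check of this telescoped form against the recursion, on one fixed realization of $ \mathbf{Y}_{0},\mathbf{X}_{1},\ldots,\mathbf{X}_{n} $ (a sketch; the Gaussian draws are only for illustration):

import numpy as np

rng = np.random.default_rng(1)
n = 10
y0 = rng.normal()                  # a fixed realization of Y_0
x = rng.normal(size=n)             # x[k-1] stands for X_k, k = 1, ..., n

y = y0                             # recursive form: Y_n = (1/3) Y_{n-1} + X_n
for k in range(1, n + 1):
    y = y / 3 + x[k - 1]

# telescoped form: Y_n = (1/3)^n Y_0 + sum_{k=0}^{n-1} (1/3)^k X_{n-k}
y_closed = (1/3)**n * y0 + sum((1/3)**k * x[n - k - 1] for k in range(n))

assert np.isclose(y, y_closed)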

$ E\left[\mathbf{Y}_{n}\right]=E\left[\left(\frac{1}{3}\right)^{n}\mathbf{Y}_{0}+\sum_{k=0}^{n-1}\left(\frac{1}{3}\right)^{k}\mathbf{X}_{n-k}\right]=\left(\frac{1}{3}\right)^{n}E\left[\mathbf{Y}_{0}\right]+\sum_{k=0}^{n-1}\left(\frac{1}{3}\right)^{k}E\left[\mathbf{X}_{n-k}\right] $$ =\left(\frac{1}{3}\right)^{n}\cdot0+\sum_{k=0}^{n-1}\left(\frac{1}{3}\right)^{k}\cdot0=0. $

$ E\left[\mathbf{Y}_{m}\mathbf{Y}_{n}\right]=\left(\frac{1}{3}\right)^{m+n}E\left[\mathbf{Y}_{0}^{2}\right]+\left(\frac{1}{3}\right)^{m}E\left[\mathbf{Y}_{0}\sum_{k=0}^{n-1}\left(\frac{1}{3}\right)^{k}\mathbf{X}_{n-k}\right] $$ \qquad+\left(\frac{1}{3}\right)^{n}E\left[\mathbf{Y}_{0}\sum_{k=0}^{m-1}\left(\frac{1}{3}\right)^{k}\mathbf{X}_{m-k}\right]+\sum_{i=0}^{m-1}\sum_{j=0}^{n-1}\left(\frac{1}{3}\right)^{i+j}E\left[\mathbf{X}_{m-i}\cdot\mathbf{X}_{n-j}\right] $$ =\left(\frac{1}{3}\right)^{m+n}\cdot\left(\sigma^{2}+0^{2}\right)+\sum_{k=1}^{\min\left(m,n\right)}\left(\frac{1}{3}\right)^{m+n-2k} $$ =\left(\frac{1}{3}\right)^{m+n}\cdot\sigma^{2}+\sum_{k=1}^{\min\left(m,n\right)}\left(\frac{1}{3}\right)^{m+n-2k}. $

$ \because\;\sum_{i=0}^{m-1}\sum_{j=0}^{n-1}\left(\frac{1}{3}\right)^{i+j}E\left[\mathbf{X}_{m-i}\cdot\mathbf{X}_{n-j}\right]=\sum_{i=1}^{m}\sum_{j=1}^{n}\left(\frac{1}{3}\right)^{m-i}\left(\frac{1}{3}\right)^{n-j}E\left[\mathbf{X}_{i}\cdot\mathbf{X}_{j}\right]=\sum_{k=1}^{\min\left(m,n\right)}\left(\frac{1}{3}\right)^{m+n-2k}, $ and the two cross terms vanish because $ \mathbf{Y}_{0} $ is independent of $ \left\{ \mathbf{X}_{n}\right\} $ and both have zero mean.

Writing $ p=\min\left(m,n\right) $, so that $ m+n-2p=\left|m-n\right| $,

$ \sum_{k=1}^{p}\left(\frac{1}{3}\right)^{m+n-2k}=\left(\frac{1}{3}\right)^{\left|m-n\right|}\sum_{j=0}^{p-1}\left(\frac{1}{9}\right)^{j}=\frac{9}{8}\left(\frac{1}{3}\right)^{\left|m-n\right|}\left(1-\left(\frac{1}{9}\right)^{p}\right). $

As $ m,n\rightarrow\infty $ with $ m-n $ fixed, $ \left(\frac{1}{3}\right)^{m+n}\sigma^{2}\rightarrow0 $, so $ E\left[\mathbf{Y}_{m}\mathbf{Y}_{n}\right]\rightarrow\frac{9}{8}\left(\frac{1}{3}\right)^{\left|m-n\right|} $. Hence $ \mathbf{Y}_{n} $ is asymptotically wide sense stationary with steady state mean $ 0 $ and autocorrelation function $ R_{\mathbf{Y}}\left(k\right)=\frac{9}{8}\left(\frac{1}{3}\right)^{\left|k\right|} $.
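The exact expression for $ E\left[\mathbf{Y}_{m}\mathbf{Y}_{n}\right] $ can also be checked by Monte Carlo simulation (a sketch, assuming Gaussian noise; the values of m, n, sigma2, and the trial count are arbitrary illustrations):

import numpy as np

rng = np.random.default_rng(2)
m, n, sigma2, trials = 6, 9, 2.0, 200_000

y0 = rng.normal(scale=np.sqrt(sigma2), size=trials)   # independent copies of Y_0
x = rng.normal(size=(trials, max(m, n)))               # x[:, k-1] stands for X_k

def y_at(t):
    """Y_t on every sample path, via the recursion Y_t = (1/3) Y_{t-1} + X_t."""
    y = y0.copy()
    for k in range(1, t + 1):
        y = y / 3 + x[:, k - 1]
    return y

empirical = np.mean(y_at(m) * y_at(n))
exact = (1/3)**(m + n) * sigma2 + sum((1/3)**(m + n - 2*k)
                                      for k in range(1, min(m, n) + 1))
print(empirical, exact)   # the two should agree up to Monte Carlo error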

b. (5%)

For what choice of $ \sigma^{2} $ is the process wide sense stationary; i.e., not just asymptotically wide sense stationary?
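The original page gives no worked answer for this part. Setting $ m=n $ in the expression from part (a) gives $ E\left[\mathbf{Y}_{n}^{2}\right]=\left(\frac{1}{9}\right)^{n}\sigma^{2}+\sum_{k=1}^{n}\left(\frac{1}{9}\right)^{n-k} $, so one way to probe the question numerically is to see which $ \sigma^{2} $ keeps this variance constant in $ n $. The value $ 9/8 $ tried below is a conjecture based on that steady-state sum, not something stated on the original page:

import numpy as np

def var_yn(n, sigma2):
    """E[Y_n^2] from part (a) with m = n: (1/9)^n * sigma2 + sum_{k=1}^{n} (1/9)^(n-k)."""
    return (1/9)**n * sigma2 + sum((1/9)**(n - k) for k in range(1, n + 1))

for sigma2 in (1.0, 9/8, 2.0):          # 9/8 is the conjectured WSS choice
    print(sigma2, [round(var_yn(n, sigma2), 6) for n in range(1, 6)])
# Only sigma2 = 9/8 yields a variance that does not change with n.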
