ECE600 Homework
From the notes of Sangchun Han, ECE PhD student.
Problem 7-20 (Homework 10)
We place at random $ n $ points in the interval $ \left(0,1\right) $ and we denote by $ \mathbf{X} $ and $ \mathbf{Y} $ the distance from the origin to the first and last point, respectively. Find $ F\left(x\right) $ , $ F\left(y\right) $ , and $ F\left(x,y\right) $ .
Solution
The event $ \left\{ \mathbf{X}\leq x\right\} $ occurs if at least one point falls in the interval $ \left(0,x\right] $ , and the event $ \left\{ \mathbf{Y}\leq y\right\} $ occurs if all of the points fall in the interval $ \left(0,y\right] $ . Let $ A_{x}\triangleq\left\{ \mathbf{X}\leq x\right\} =\left\{ \text{all points fall in }\left(x,1\right)\right\} ^{C} $ and $ B_{y}\triangleq\left\{ \mathbf{Y}\leq y\right\} =\left\{ \text{no points fall in }\left(y,1\right)\right\} =\left\{ \text{all points fall in }\left(0,y\right)\right\} $ .

Hence for $ x\in\left[0,1\right] $ and $ y\in\left[0,1\right] $ , we have $ F_{\mathbf{X}}\left(x\right)=P\left(A_{x}\right)=1-P\left(\bar{A}_{x}\right)=1-\left(1-x\right)^{n} $ and $ F_{\mathbf{Y}}\left(y\right)=P\left(B_{y}\right)=y^{n} $ .

Because $ F_{\mathbf{XY}}\left(x,y\right)=P\left(A_{x}\cap B_{y}\right) $ and $ B_{y}=\left(A_{x}\cap B_{y}\right)\cup\left(\bar{A}_{x}\cap B_{y}\right) $ , we have $ F_{\mathbf{XY}}\left(x,y\right)=P\left(A_{x}\cap B_{y}\right)=P\left(B_{y}\right)-P\left(\bar{A}_{x}\cap B_{y}\right) $ .

Now if $ x\leq y $ , then $ \bar{A}_{x}\cap B_{y}=\left\{ \text{all points fall in the interval }\left(x,y\right]\right\} $ and $ P\left(\bar{A}_{x}\cap B_{y}\right)=\left(y-x\right)^{n} $ . If $ x>y $ , then $ \bar{A}_{x}\cap B_{y}=\varnothing $ and $ P\left(\bar{A}_{x}\cap B_{y}\right)=0 $ . Thus, $ F_{\mathbf{XY}}\left(x,y\right)=P\left(B_{y}\right)-P\left(\bar{A}_{x}\cap B_{y}\right)=\left\{ \begin{array}{lll} y^{n}-\left(y-x\right)^{n} & , & x\leq y\\ y^{n} & , & x>y. \end{array}\right. $
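As a sanity check of these formulas, here is a minimal Monte Carlo sketch (not part of the original solution; the choice $ n=5 $ , the trial count, and the test points are arbitrary assumptions for illustration):

```python
# Monte Carlo sanity check of F_X(x) = 1-(1-x)^n, F_Y(y) = y^n,
# and F_XY(x,y) = y^n - (y-x)^n for x <= y.  Illustrative only.
import numpy as np

rng = np.random.default_rng(0)
n, trials = 5, 200_000                    # assumed values for the sketch
pts = rng.uniform(0.0, 1.0, size=(trials, n))
X = pts.min(axis=1)                       # distance to the first point
Y = pts.max(axis=1)                       # distance to the last point

for x in (0.2, 0.5, 0.8):
    print(f"F_X({x}): empirical {np.mean(X <= x):.4f}, formula {1 - (1 - x)**n:.4f}")
for y in (0.2, 0.5, 0.8):
    print(f"F_Y({y}): empirical {np.mean(Y <= y):.4f}, formula {y**n:.4f}")

x, y = 0.3, 0.7                           # a point with x <= y
print(f"F_XY({x},{y}): empirical {np.mean((X <= x) & (Y <= y)):.4f}, "
      f"formula {y**n - (y - x)**n:.4f}")
```

The empirical fractions should agree with the closed-form CDFs to within Monte Carlo error.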
Problem 7-25 (Homework 10)
Show that if $ a_{n}\rightarrow a $ and $ E\left\{ \left|\mathbf{X}_{n}-a_{n}\right|^{2}\right\} \rightarrow0 $ , then $ \mathbf{X}_{n}\rightarrow a $ in the MS sense as $ n\rightarrow\infty $ .
Solution
• Definition of m.s. convergence
We say that a random sequence converges in mean-square (m.s.) to a random variable $ \mathbf{X} $ if $ E\left[\left|\mathbf{X}_{n}-\mathbf{X}\right|^{2}\right]\rightarrow0\text{ as }n\rightarrow\infty. $
• Expand $ E\left\{ \left(\mathbf{X}_{n}-a\right)^{2}\right\} $ using $ a_{n} $ : $ E\left\{ \left(\mathbf{X}_{n}-a\right)^{2}\right\} =E\left\{ \left(\mathbf{X}_{n}-a_{n}+a_{n}-a\right)^{2}\right\} =E\left\{ \left(\mathbf{X}_{n}-a_{n}\right)^{2}\right\} +2\left(a_{n}-a\right)E\left\{ \mathbf{X}_{n}-a_{n}\right\} +\left(a_{n}-a\right)^{2} $ , where by the Cauchy–Schwarz inequality $ \left|E\left\{ \mathbf{X}_{n}-a_{n}\right\} \right|\leq\sqrt{E\left\{ \left(\mathbf{X}_{n}-a_{n}\right)^{2}\right\} } $ .
• Substitution: Now as $ n\rightarrow\infty $ , we are given 1) $ a_{n}\rightarrow a $ , 2) $ E\left\{ \left|\mathbf{X}_{n}-a_{n}\right|^{2}\right\} \rightarrow0 $ .
• Thus, we have $ \lim_{n\rightarrow\infty}E\left\{ \left(\mathbf{X}_{n}-a\right)^{2}\right\} =\lim_{n\rightarrow\infty}\left[E\left\{ \left(\mathbf{X}_{n}-a_{n}\right)^{2}\right\} +2\left(a_{n}-a\right)E\left\{ \mathbf{X}_{n}-a_{n}\right\} +\left(a_{n}-a\right)^{2}\right]=0 $ , since every term on the right tends to zero. $ \therefore E\left\{ \left(\mathbf{X}_{n}-a\right)^{2}\right\} \rightarrow0\text{ as }n\rightarrow\infty $ , i.e., $ \mathbf{X}_{n}\rightarrow a $ in the MS sense. (A numerical sketch of this convergence is given below.)
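Below is a minimal numerical illustration (not part of the original notes); the particular sequences $ a_{n}=a+1/n $ and $ \mathbf{X}_{n}=a_{n}+\mathbf{Z}_{n}/\sqrt{n} $ with $ \mathbf{Z}_{n} $ standard normal are assumptions chosen so that $ a_{n}\rightarrow a $ and $ E\left\{ \left|\mathbf{X}_{n}-a_{n}\right|^{2}\right\} =1/n\rightarrow0 $ :

```python
# Numerical illustration of Problem 7-25 under assumed sequences:
# a_n = a + 1/n  and  X_n = a_n + Z_n/sqrt(n), Z_n ~ N(0, 1),
# so that a_n -> a and E{(X_n - a_n)^2} = 1/n -> 0.
import numpy as np

rng = np.random.default_rng(0)
a = 1.0
trials = 200_000

for n in (10, 100, 1_000, 10_000):
    a_n = a + 1.0 / n
    X_n = a_n + rng.standard_normal(trials) / np.sqrt(n)
    mse_to_an = np.mean((X_n - a_n) ** 2)   # estimates E{(X_n - a_n)^2}
    mse_to_a = np.mean((X_n - a) ** 2)      # estimates E{(X_n - a)^2}
    print(f"n={n:6d}  E(X_n-a_n)^2 ~ {mse_to_an:.5f}  E(X_n-a)^2 ~ {mse_to_a:.5f}")
# Both estimates shrink toward 0, consistent with X_n -> a in the MS sense.
```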
Problem 7-27 (Homework 10)
An infinite sum is by definition a limit: $ \sum_{k=1}^{\infty}\mathbf{X}_{k}=\lim_{n\rightarrow\infty}\mathbf{Y}_{n}\text{ where }\mathbf{Y}_{n}=\sum_{k=1}^{n}\mathbf{X}_{k}. $ Show that if the random variables $ \mathbf{X}_{k} $ are independent with zero mean and variance $ \sigma_{k}^{2} $ , then the sum exists in the MS sense iff $ \sum_{k=1}^{\infty}\sigma_{k}^{2}<\infty. $
Hint:
$ E\left\{ \left(\mathbf{Y}_{n+m}-\mathbf{Y}_{n}\right)^{2}\right\} =\sum_{k=n+1}^{n+m}\sigma_{k}^{2}. $
Solution
We say that a random sequence converges in mean-square (m.s.) to a random variable $ \mathbf{X} $ if $ E\left[\left|\mathbf{X}_{n}-\mathbf{X}\right|^{2}\right]\rightarrow0\text{ as }n\rightarrow\infty $ . By the Cauchy criterion, the limit $ \lim_{n\rightarrow\infty}\mathbf{Y}_{n} $ exists in the MS sense iff $ E\left\{ \left(\mathbf{Y}_{n+m}-\mathbf{Y}_{n}\right)^{2}\right\} \rightarrow0 $ as $ n\rightarrow\infty $ , uniformly in $ m $ . Since the $ \mathbf{X}_{k} $ are independent with zero mean, the cross terms vanish and, as in the hint, $ E\left\{ \left(\mathbf{Y}_{n+m}-\mathbf{Y}_{n}\right)^{2}\right\} =E\left\{ \left(\sum_{k=n+1}^{n+m}\mathbf{X}_{k}\right)^{2}\right\} =\sum_{k=n+1}^{n+m}\sigma_{k}^{2}. $ Because $ \sup_{m}\sum_{k=n+1}^{n+m}\sigma_{k}^{2}=\sum_{k=n+1}^{\infty}\sigma_{k}^{2} $ , this tail tends to zero as $ n\rightarrow\infty $ iff the series $ \sum_{k=1}^{\infty}\sigma_{k}^{2} $ converges. Hence the sum exists in the MS sense iff $ \sum_{k=1}^{\infty}\sigma_{k}^{2}<\infty. $
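Here is a minimal numerical sketch of the hint and of the Cauchy criterion (the two variance sequences $ \sigma_{k}^{2}=1/k^{2} $ and $ \sigma_{k}^{2}=1/k $ , the Gaussian choice of $ \mathbf{X}_{k} $ , and the values of $ n $ and $ m $ are assumptions for illustration only):

```python
# Estimate E{(Y_{n+m} - Y_n)^2} for independent zero-mean X_k and compare it
# with the tail sum of variances (the hint).  Illustrative assumptions:
# sigma_k^2 = 1/k^2 (summable) vs. sigma_k^2 = 1/k (not summable).
import numpy as np

rng = np.random.default_rng(0)
trials, n, m = 100_000, 50, 50

for label, sigma2 in (("1/k^2 (series converges)", lambda k: 1.0 / k**2),
                      ("1/k   (series diverges)",  lambda k: 1.0 / k)):
    var = np.array([sigma2(k) for k in range(1, n + m + 1)])
    X = rng.standard_normal((trials, n + m)) * np.sqrt(var)  # zero mean, var sigma_k^2
    Y = np.cumsum(X, axis=1)                                 # partial sums Y_j
    ms_increment = np.mean((Y[:, n + m - 1] - Y[:, n - 1]) ** 2)
    tail_sum = var[n:].sum()                                 # sum_{k=n+1}^{n+m} sigma_k^2
    print(f"sigma_k^2 = {label}: E(Y_(n+m)-Y_n)^2 ~ {ms_increment:.4f}, "
          f"tail sum = {tail_sum:.4f}")
# In the summable case the tail sum can be made arbitrarily small by taking n
# large, so {Y_n} is Cauchy in mean square; in the divergent case it cannot.
```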