*[[ECE 600 Exams Addition of two independent Poisson random variables|Addition of two independent Poisson random variables]]

*[[ECE 600 Exams Addition of two independent Gaussian random variables|Addition of two independent Gaussian random variables]]

*[[ECE 600 Exams Addition of two jointly distributed Gaussian random variables|Addition of two jointly distributed Gaussian random variables]]

*[[ECE 600 Exams Two jointly distributed random variables|Two jointly distributed random variables]]

*[[ECE 600 Exams A sum of a random number of iid Gaussian random variables|A sum of a random number of iid Gaussian random variables]]

----

[[ECE600|Back to ECE600]]
=Example. Addition of two independent Poisson random variables=

Let <math>\mathbf{Z}=\mathbf{X}+\mathbf{Y}</math> where <math>\mathbf{X}</math> and <math>\mathbf{Y}</math> are independent Poisson random variables with means <math>\lambda</math> and <math>\mu</math>, respectively.

(a) Find the pmf of <math>\mathbf{Z}</math>.

Recall the characteristic function of a Poisson random variable:

<math>\Phi_{\mathbf{X}}(\omega)=e^{-\lambda\left(1-e^{i\omega}\right)},\qquad\Phi_{\mathbf{Y}}(\omega)=e^{-\mu\left(1-e^{i\omega}\right)}.</math>

Since <math>\mathbf{X}</math> and <math>\mathbf{Y}</math> are independent, <math>e^{i\omega\mathbf{X}}</math> and <math>e^{i\omega\mathbf{Y}}</math> are also independent, hence uncorrelated, so the expectation of their product factors:

<math>\Phi_{\mathbf{Z}}(\omega)=E\left[e^{i\omega\mathbf{Z}}\right]=E\left[e^{i\omega\left(\mathbf{X}+\mathbf{Y}\right)}\right]=E\left[e^{i\omega\mathbf{X}}e^{i\omega\mathbf{Y}}\right]=E\left[e^{i\omega\mathbf{X}}\right]\cdot E\left[e^{i\omega\mathbf{Y}}\right]=e^{-\lambda\left(1-e^{i\omega}\right)}\cdot e^{-\mu\left(1-e^{i\omega}\right)}=e^{-\left(\lambda+\mu\right)\left(1-e^{i\omega}\right)}.</math>

By the uniqueness of characteristic functions, <math>\mathbf{Z}</math> is a Poisson random variable with mean <math>\lambda+\mu</math>.

<math>\therefore p_{\mathbf{Z}}(k)=\frac{e^{-\left(\lambda+\mu\right)}\left(\lambda+\mu\right)^{k}}{k!},\qquad k=0,1,2,\cdots</math>

(b) Show that the conditional pmf of <math>\mathbf{X}</math> conditioned on the event <math>\left\{ \mathbf{Z}=n\right\}</math> is binomially distributed, and determine the parameters of the binomial distribution (<math>n</math> and <math>p</math>).

<math>p_{\mathbf{X}}\left(k|\left\{ \mathbf{Z}=n\right\} \right)=P\left(\left\{ \mathbf{X}=k\right\} |\left\{ \mathbf{Z}=n\right\} \right)=\frac{P\left(\left\{ \mathbf{X}=k\right\} \cap\left\{ \mathbf{Z}=n\right\} \right)}{P\left(\left\{ \mathbf{Z}=n\right\} \right)}=\frac{P\left(\left\{ \mathbf{X}=k\right\} \cap\left\{ \mathbf{Y}=n-k\right\} \right)}{P\left(\left\{ \mathbf{Z}=n\right\} \right)}</math><math>=\frac{\frac{e^{-\lambda}\lambda^{k}}{k!}\cdot\frac{e^{-\mu}\mu^{n-k}}{\left(n-k\right)!}}{\frac{e^{-\left(\lambda+\mu\right)}\left(\lambda+\mu\right)^{n}}{n!}}=\left(\frac{n!}{k!\left(n-k\right)!}\right)\left(\frac{\lambda}{\lambda+\mu}\right)^{k}\left(\frac{\mu}{\lambda+\mu}\right)^{n-k}=\left(\begin{array}{c}
n\\
k
\end{array}\right)\left(\frac{\lambda}{\lambda+\mu}\right)^{k}\left(\frac{\mu}{\lambda+\mu}\right)^{n-k}\;,\; k=0,1,\cdots,n</math>

This is a binomial pmf <math>b(n,p)</math> with parameters <math>n</math> and <math>p=\frac{\lambda}{\lambda+\mu}</math>.
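As an optional sanity check, here is a minimal Monte Carlo sketch of parts (a) and (b). It assumes NumPy is available; the values of <math>\lambda</math>, <math>\mu</math>, <math>n</math>, and the sample size are illustrative choices, not part of the problem.

<pre>
# Hypothetical sanity check (not part of the original solution).
import numpy as np

rng = np.random.default_rng(0)
lam, mu, trials = 2.0, 3.0, 200_000

X = rng.poisson(lam, trials)
Y = rng.poisson(mu, trials)
Z = X + Y

# (a) Z should be Poisson(lam + mu): mean and variance both near lam + mu = 5.
print(Z.mean(), Z.var())

# (b) Given Z = n, X should be Binomial(n, lam/(lam + mu)).
n, p = 6, lam / (lam + mu)
Xn = X[Z == n]
print(Xn.mean(), n * p)            # conditional mean vs. n*p
print(Xn.var(), n * p * (1 - p))   # conditional variance vs. n*p*(1-p)
</pre>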
=Example. Addition of two independent Gaussian random variables=

<math>\mathbf{X}\sim\mathcal{N}\left(0,\sigma_{\mathbf{X}}^{2}\right),\;\mathbf{N}\sim\mathcal{N}\left(0,\sigma_{\mathbf{N}}^{2}\right),\;\mathbf{Y}=\mathbf{X}+\mathbf{N}.</math>

(a) Find the correlation coefficient between <math>\mathbf{X}</math> and <math>\mathbf{Y}</math>.

<math>\sigma_{\mathbf{Y}}=\sqrt{\sigma_{\mathbf{X}}^{2}+2r_{\mathbf{XN}}\sigma_{\mathbf{X}}\sigma_{\mathbf{N}}+\sigma_{\mathbf{N}}^{2}}=\sqrt{\sigma_{\mathbf{X}}^{2}+\sigma_{\mathbf{N}}^{2}}</math>

because <math>\mathbf{X}</math> and <math>\mathbf{N}</math> are independent <math>\Longrightarrow</math> uncorrelated <math>\Longrightarrow r_{\mathbf{XN}}=0</math>.

<math>r_{\mathbf{XY}}=\frac{\text{cov}(\mathbf{X},\mathbf{Y})}{\sigma_{\mathbf{X}}\sigma_{\mathbf{Y}}}=\frac{E\left[\mathbf{XY}\right]-E\left[\mathbf{X}\right]E\left[\mathbf{Y}\right]}{\sigma_{\mathbf{X}}\sqrt{\sigma_{\mathbf{X}}^{2}+\sigma_{\mathbf{N}}^{2}}}=\frac{E\left[\mathbf{X}\left(\mathbf{X}+\mathbf{N}\right)\right]}{\sigma_{\mathbf{X}}\sqrt{\sigma_{\mathbf{X}}^{2}+\sigma_{\mathbf{N}}^{2}}}=\frac{E\left[\mathbf{X}^{2}\right]+E\left[\mathbf{XN}\right]}{\sigma_{\mathbf{X}}\sqrt{\sigma_{\mathbf{X}}^{2}+\sigma_{\mathbf{N}}^{2}}}</math><math>=\frac{\sigma_{\mathbf{X}}^{2}+E\left[\mathbf{X}\right]E\left[\mathbf{N}\right]}{\sigma_{\mathbf{X}}\sqrt{\sigma_{\mathbf{X}}^{2}+\sigma_{\mathbf{N}}^{2}}}=\frac{\sigma_{\mathbf{X}}}{\sqrt{\sigma_{\mathbf{X}}^{2}+\sigma_{\mathbf{N}}^{2}}}\qquad\because E\left[\mathbf{X}\right]=E\left[\mathbf{N}\right]=0.</math>

(b) Find the conditional pdf of <math>\mathbf{X}</math> conditioned on the event <math>\left\{ \mathbf{Y}=y\right\}</math>.

<math>f_{\mathbf{X}}\left(x|\left\{ \mathbf{Y}=y\right\} \right)=\frac{f_{\mathbf{XY}}\left(x,y\right)}{f_{\mathbf{Y}}(y)}=\frac{\frac{1}{2\pi\sigma_{\mathbf{X}}\sigma_{\mathbf{Y}}\sqrt{1-r^{2}}}\exp\left\{ \frac{-1}{2\left(1-r^{2}\right)}\left[\frac{x^{2}}{\sigma_{\mathbf{X}}^{2}}-\frac{2rxy}{\sigma_{\mathbf{X}}\sigma_{\mathbf{Y}}}+\frac{y^{2}}{\sigma_{\mathbf{Y}}^{2}}\right]\right\} }{\frac{1}{\sqrt{2\pi}\sigma_{\mathbf{Y}}}\exp\left\{ \frac{-y^{2}}{2\sigma_{\mathbf{Y}}^{2}}\right\} }</math><math>=\frac{1}{\sqrt{2\pi}\sigma_{\mathbf{X}}\sqrt{1-r^{2}}}\exp\left\{ \frac{-1}{2\left(1-r^{2}\right)}\left[\frac{x^{2}}{\sigma_{\mathbf{X}}^{2}}-\frac{2rxy}{\sigma_{\mathbf{X}}\sigma_{\mathbf{Y}}}+\frac{y^{2}}{\sigma_{\mathbf{Y}}^{2}}-\frac{\left(1-r^{2}\right)y^{2}}{\sigma_{\mathbf{Y}}^{2}}\right]\right\}</math> <math>=\frac{1}{\sqrt{2\pi}\sigma_{\mathbf{X}}\sqrt{1-r^{2}}}\exp\left\{ \frac{-1}{2\left(1-r^{2}\right)}\left[\frac{x^{2}}{\sigma_{\mathbf{X}}^{2}}-\frac{2rxy}{\sigma_{\mathbf{X}}\sigma_{\mathbf{Y}}}+\frac{r^{2}y^{2}}{\sigma_{\mathbf{Y}}^{2}}\right]\right\}</math> <math>=\frac{1}{\sqrt{2\pi}\sigma_{\mathbf{X}}\sqrt{1-r^{2}}}\exp\left\{ \frac{-1}{2\left(1-r^{2}\right)\sigma_{\mathbf{X}}^{2}}\left[x^{2}-\frac{2r\sigma_{\mathbf{X}}xy}{\sigma_{\mathbf{Y}}}+\frac{r^{2}\sigma_{\mathbf{X}}^{2}y^{2}}{\sigma_{\mathbf{Y}}^{2}}\right]\right\}</math> <math>=\frac{1}{\sqrt{2\pi}\sigma_{\mathbf{X}}\sqrt{1-r^{2}}}\exp\left\{ \frac{-1}{2\left(1-r^{2}\right)\sigma_{\mathbf{X}}^{2}}\left(x-\frac{r\sigma_{\mathbf{X}}y}{\sigma_{\mathbf{Y}}}\right)^{2}\right\}</math>

Noting that <math>\sqrt{1-r^{2}}=\sqrt{1-\frac{\sigma_{\mathbf{X}}^{2}}{\sigma_{\mathbf{X}}^{2}+\sigma_{\mathbf{N}}^{2}}}=\sqrt{\frac{\sigma_{\mathbf{X}}^{2}+\sigma_{\mathbf{N}}^{2}-\sigma_{\mathbf{X}}^{2}}{\sigma_{\mathbf{X}}^{2}+\sigma_{\mathbf{N}}^{2}}}=\frac{\sigma_{\mathbf{N}}}{\sqrt{\sigma_{\mathbf{X}}^{2}+\sigma_{\mathbf{N}}^{2}}}</math> and

<math>r\cdot\frac{\sigma_{\mathbf{X}}}{\sigma_{\mathbf{Y}}}=\frac{\sigma_{\mathbf{X}}}{\sqrt{\sigma_{\mathbf{X}}^{2}+\sigma_{\mathbf{N}}^{2}}}\cdot\frac{\sigma_{\mathbf{X}}}{\sqrt{\sigma_{\mathbf{X}}^{2}+\sigma_{\mathbf{N}}^{2}}}=\frac{\sigma_{\mathbf{X}}^{2}}{\sigma_{\mathbf{X}}^{2}+\sigma_{\mathbf{N}}^{2}},</math>

we obtain

<math>\therefore f_{\mathbf{X}}\left(x|\left\{ \mathbf{Y}=y\right\} \right)=\frac{1}{\sqrt{2\pi}\cdot\frac{\sigma_{\mathbf{X}}\sigma_{\mathbf{N}}}{\sqrt{\sigma_{\mathbf{X}}^{2}+\sigma_{\mathbf{N}}^{2}}}}\exp\left\{ \frac{-1}{2\frac{\sigma_{\mathbf{X}}^{2}\sigma_{\mathbf{N}}^{2}}{\sigma_{\mathbf{X}}^{2}+\sigma_{\mathbf{N}}^{2}}}\left(x-\frac{\sigma_{\mathbf{X}}^{2}}{\sigma_{\mathbf{X}}^{2}+\sigma_{\mathbf{N}}^{2}}\cdot y\right)^{2}\right\}</math>

(c) What kind of pdf is the pdf you determined in part (b)? What are the mean and variance of a random variable with this pdf?

This is a Gaussian pdf with mean <math>\frac{\sigma_{\mathbf{X}}^{2}}{\sigma_{\mathbf{X}}^{2}+\sigma_{\mathbf{N}}^{2}}\cdot y</math> and variance <math>\frac{\sigma_{\mathbf{X}}^{2}\sigma_{\mathbf{N}}^{2}}{\sigma_{\mathbf{X}}^{2}+\sigma_{\mathbf{N}}^{2}}</math>.

(d) What is the minimum mean-square estimate of <math>\mathbf{X}</math> given that <math>\left\{ \mathbf{Y}=y\right\}</math>?

The minimum mean-square error estimate of <math>\mathbf{X}</math> given <math>\mathbf{Y}=y</math> is

<math>\hat{x}_{MMS}(y)=E\left[\mathbf{X}|\left\{ \mathbf{Y}=y\right\} \right]=\int_{-\infty}^{\infty}x\cdot f_{\mathbf{X}}\left(x|\left\{ \mathbf{Y}=y\right\} \right)dx=\frac{\sigma_{\mathbf{X}}^{2}}{\sigma_{\mathbf{X}}^{2}+\sigma_{\mathbf{N}}^{2}}\cdot y</math> from part (b).

(e) What is the maximum a posteriori estimate of <math>\mathbf{X}</math> given that <math>\left\{ \mathbf{Y}=y\right\}</math>?

<math>\hat{x}_{MAP}(y)=\arg\max_{x\in\mathbf{R}}\left\{ f_{\mathbf{X}}\left(x|\left\{ \mathbf{Y}=y\right\} \right)\right\} =\frac{\sigma_{\mathbf{X}}^{2}}{\sigma_{\mathbf{X}}^{2}+\sigma_{\mathbf{N}}^{2}}\cdot y,</math>

since a Gaussian pdf takes on its maximum value at its mean.

(f) Given that I observe <math>\mathbf{Y}=y</math>, what is <math>E\left[\mathbf{X}|\left\{ \mathbf{Y}=y\right\} \right]</math>?

<math>E\left[\mathbf{X}|\left\{ \mathbf{Y}=y\right\} \right]=\frac{\sigma_{\mathbf{X}}^{2}}{\sigma_{\mathbf{X}}^{2}+\sigma_{\mathbf{N}}^{2}}\cdot y</math> from part (d).
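As a quick numerical illustration of parts (b)–(f), the sketch below estimates <math>E\left[\mathbf{X}|\mathbf{Y}\approx y\right]</math> from samples and compares it with <math>\frac{\sigma_{\mathbf{X}}^{2}}{\sigma_{\mathbf{X}}^{2}+\sigma_{\mathbf{N}}^{2}}y</math>. It assumes NumPy; the variances, the value of <math>y</math>, and the window width are arbitrary picks.

<pre>
# Hypothetical check of the conditional-mean formula (not part of the original solution).
import numpy as np

rng = np.random.default_rng(1)
sx, sn, trials = 2.0, 1.0, 500_000

X = rng.normal(0.0, sx, trials)
N = rng.normal(0.0, sn, trials)
Y = X + N

y = 1.5                             # condition on Y near this value
mask = np.abs(Y - y) < 0.05         # samples with Y in a small window around y

print(X[mask].mean())                                     # empirical E[X | Y ~ y]
print(sx**2 / (sx**2 + sn**2) * y)                        # theoretical MMSE estimate
print(X[mask].var(), (sx**2 * sn**2) / (sx**2 + sn**2))   # conditional variance
</pre>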
=Example. Addition of two jointly distributed Gaussian random variables=

Let <math>\mathbf{X}</math> and <math>\mathbf{Y}</math> be two jointly distributed Gaussian random variables. Assume <math>\mathbf{X}</math> has mean <math>\mu_{\mathbf{X}}</math> and variance <math>\sigma_{\mathbf{X}}^{2}</math>, <math>\mathbf{Y}</math> has mean <math>\mu_{\mathbf{Y}}</math> and variance <math>\sigma_{\mathbf{Y}}^{2}</math>, and that the correlation coefficient between <math>\mathbf{X}</math> and <math>\mathbf{Y}</math> is <math>r</math>. Define a new random variable <math>\mathbf{Z}=\mathbf{X}+\mathbf{Y}</math>.

(a) Show that <math>\mathbf{Z}</math> is a Gaussian random variable.

If <math>\mathbf{Z}</math> is a Gaussian random variable, then it has a characteristic function of the form

<math>\Phi_{\mathbf{Z}}\left(\omega\right)=e^{i\mu_{\mathbf{Z}}\omega}e^{-\frac{1}{2}\sigma_{\mathbf{Z}}^{2}\omega^{2}}.</math>

Now,

<math>\Phi_{\mathbf{Z}}\left(\omega\right)=E\left[e^{i\omega\mathbf{Z}}\right]=E\left[e^{i\omega\left(\mathbf{X}+\mathbf{Y}\right)}\right]=\Phi_{\mathbf{XY}}\left(\omega,\omega\right),</math>

where <math>\Phi_{\mathbf{XY}}\left(\omega_{1},\omega_{2}\right)</math> is the joint characteristic function of <math>\mathbf{X}</math> and <math>\mathbf{Y}</math>, defined as

<math>\Phi_{\mathbf{XY}}\left(\omega_{1},\omega_{2}\right)=E\left[e^{i\left(\omega_{1}\mathbf{X}+\omega_{2}\mathbf{Y}\right)}\right].</math>

Now because <math>\mathbf{X}</math> and <math>\mathbf{Y}</math> are jointly Gaussian with the given parameters, we know that

<math>\Phi_{\mathbf{XY}}\left(\omega_{1},\omega_{2}\right)=e^{i\left(\mu_{X}\omega_{1}+\mu_{Y}\omega_{2}\right)}e^{-\frac{1}{2}\left(\sigma_{X}^{2}\omega_{1}^{2}+2r\sigma_{X}\sigma_{Y}\omega_{1}\omega_{2}+\sigma_{Y}^{2}\omega_{2}^{2}\right)}.</math>

Thus,

<math>\Phi_{\mathbf{Z}}\left(\omega\right)=\Phi_{\mathbf{XY}}\left(\omega,\omega\right)=e^{i\left(\mu_{X}\omega+\mu_{Y}\omega\right)}e^{-\frac{1}{2}\left(\sigma_{X}^{2}\omega^{2}+2r\sigma_{X}\sigma_{Y}\omega^{2}+\sigma_{Y}^{2}\omega^{2}\right)}</math><math>=e^{i\left(\mu_{X}+\mu_{Y}\right)\omega}e^{-\frac{1}{2}\left(\sigma_{X}^{2}+2r\sigma_{X}\sigma_{Y}+\sigma_{Y}^{2}\right)\omega^{2}}=e^{i\mu_{Z}\omega}e^{-\frac{1}{2}\sigma_{Z}^{2}\omega^{2}}</math>

where <math>\mu_{Z}=\mu_{X}+\mu_{Y}</math> and <math>\sigma_{Z}^{2}=\sigma_{X}^{2}+2r\sigma_{X}\sigma_{Y}+\sigma_{Y}^{2}</math>.

<math>\therefore\ \mathbf{Z}</math> is a Gaussian random variable with <math>E\left[\mathbf{Z}\right]=\mu_{X}+\mu_{Y}</math> and <math>Var\left[\mathbf{Z}\right]=\sigma_{X}^{2}+2r\sigma_{X}\sigma_{Y}+\sigma_{Y}^{2}</math>.

(b) Find the variance of <math>\mathbf{Z}</math>.

As shown in part (a), <math>Var\left[\mathbf{Z}\right]=\sigma_{\mathbf{X}}^{2}+2r\sigma_{\mathbf{X}}\sigma_{\mathbf{Y}}+\sigma_{\mathbf{Y}}^{2}</math>.
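A short simulation of the mean and variance formulas from part (a), using NumPy's multivariate normal sampler to generate correlated jointly Gaussian pairs. The parameter values are illustrative assumptions, not part of the problem.

<pre>
# Hypothetical check of E[Z] and Var[Z] for correlated jointly Gaussian X, Y.
import numpy as np

rng = np.random.default_rng(2)
mu_x, mu_y, sx, sy, r, trials = 1.0, -0.5, 2.0, 1.0, 0.6, 500_000

cov = np.array([[sx**2, r * sx * sy],
                [r * sx * sy, sy**2]])
XY = rng.multivariate_normal([mu_x, mu_y], cov, trials)
Z = XY[:, 0] + XY[:, 1]

print(Z.mean(), mu_x + mu_y)                       # mean: mu_X + mu_Y
print(Z.var(), sx**2 + 2 * r * sx * sy + sy**2)    # variance includes the cross term
</pre>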
=Example. Two jointly distributed random variables=

Two jointly distributed random variables <math>\mathbf{X}</math> and <math>\mathbf{Y}</math> have joint pdf

<math>f_{\mathbf{XY}}\left(x,y\right)=\begin{cases}
c, & \text{for }x\geq0,\ y\geq0,\textrm{ and }x+y\leq1\\
0, & \text{elsewhere.}
\end{cases}</math>

(a) Find the constant <math>c</math> such that <math>f_{\mathbf{XY}}(x,y)</math> is a valid pdf.

''(Figure: the support of <math>f_{\mathbf{XY}}</math> is the triangle with vertices <math>(0,0)</math>, <math>(1,0)</math>, and <math>(0,1)</math>.)''

<math>\iint_{\mathbf{R}^{2}}f_{\mathbf{XY}}\left(x,y\right)\,dx\,dy=c\cdot\text{Area}=1</math> where <math>\text{Area}=\frac{1}{2}</math> is the area of the triangular support.

<math>\therefore c=2</math>

(b) Find the conditional density of <math>\mathbf{Y}</math> conditioned on <math>\mathbf{X}=x</math>.

<math>f_{\mathbf{Y}}\left(y|\left\{ \mathbf{X}=x\right\} \right)=\frac{f_{\mathbf{XY}}\left(x,y\right)}{f_{\mathbf{X}}(x)}.</math>

<math>f_{\mathbf{X}}(x)=\int_{-\infty}^{\infty}f_{\mathbf{XY}}\left(x,y\right)dy=\int_{0}^{1-x}2\,dy=2\left(1-x\right)\cdot\mathbf{1}_{\left[0,1\right]}(x).</math>

For <math>0\leq x<1</math>,

<math>f_{\mathbf{Y}}\left(y|\left\{ \mathbf{X}=x\right\} \right)=\frac{f_{\mathbf{XY}}\left(x,y\right)}{f_{\mathbf{X}}(x)}=\frac{2}{2\left(1-x\right)}=\frac{1}{1-x}\textrm{ for }0\leq y\leq1-x,\textrm{ i.e., }f_{\mathbf{Y}}\left(y|\left\{ \mathbf{X}=x\right\} \right)=\frac{1}{1-x}\cdot\mathbf{1}_{\left[0,1-x\right]}\left(y\right).</math>

(c) Find the minimum mean-square error estimator <math>\hat{y}_{MMS}\left(x\right)</math> of <math>\mathbf{Y}</math> given that <math>\mathbf{X}=x</math>.

<math>\hat{y}_{MMS}\left(x\right)=E\left[\mathbf{Y}|\left\{ \mathbf{X}=x\right\} \right]=\int_{\mathbf{R}}yf_{\mathbf{Y}}\left(y|\left\{ \mathbf{X}=x\right\} \right)dy=\int_{0}^{1-x}\frac{y}{1-x}dy=\frac{y^{2}}{2\left(1-x\right)}\biggl|_{0}^{1-x}=\frac{1-x}{2}.</math>

(d) Find a maximum a posteriori probability estimator.

<math>\hat{y}_{MAP}\left(x\right)=\arg\max_{y}\left\{ f_{\mathbf{Y}}\left(y|\left\{ \mathbf{X}=x\right\} \right)\right\}</math>, but <math>f_{\mathbf{Y}}\left(y|\left\{ \mathbf{X}=x\right\} \right)=\frac{1}{1-x}\cdot\mathbf{1}_{\left[0,1-x\right]}\left(y\right)</math> is constant on its support. Any <math>\hat{y}\in\left[0,1-x\right]</math> is a MAP estimator; the MAP estimator is NOT unique.
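A minimal rejection-sampling check of the conditional mean <math>\frac{1-x}{2}</math> from part (c). It assumes NumPy; the chosen <math>x</math> and the window width are arbitrary.

<pre>
# Hypothetical check: sample uniformly on the triangle, then estimate E[Y | X ~ x].
import numpy as np

rng = np.random.default_rng(3)
pts = rng.random((2_000_000, 2))         # uniform on the unit square
pts = pts[pts.sum(axis=1) <= 1.0]        # keep the triangle x + y <= 1 (pdf c = 2)

x = 0.3
mask = np.abs(pts[:, 0] - x) < 0.01      # samples with X near x
print(pts[mask, 1].mean(), (1 - x) / 2)  # empirical vs. theoretical conditional mean
</pre>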
=Example. Two jointly distributed independent random variables=

Let <math>\mathbf{X}</math> and <math>\mathbf{Y}</math> be two jointly distributed, independent random variables. The pdf of <math>\mathbf{X}</math> is

<math>f_{\mathbf{X}}\left(x\right)=xe^{-x^{2}/2}\cdot\mathbf{1}_{\left[0,\infty\right)}\left(x\right)</math>, and <math>\mathbf{Y}</math> is a Gaussian random variable with mean 0 and variance 1. Let <math>\mathbf{U}</math> and <math>\mathbf{V}</math> be two new random variables defined as <math>\mathbf{U}=\sqrt{\mathbf{X}^{2}+\mathbf{Y}^{2}}</math> and <math>\mathbf{V}=\lambda\mathbf{Y}/\mathbf{X}</math> where <math>\lambda</math> is a positive real number.

(a) Find the joint pdf of <math>\mathbf{U}</math> and <math>\mathbf{V}</math>. (Direct pdf method)

<math>f_{\mathbf{UV}}\left(u,v\right)=f_{\mathbf{XY}}\left(x\left(u,v\right),y\left(u,v\right)\right)\left|\frac{\partial\left(x,y\right)}{\partial\left(u,v\right)}\right|</math>

Solving for <math>x</math> and <math>y</math> in terms of <math>u</math> and <math>v</math>, we have <math>u^{2}=x^{2}+y^{2}</math> and <math>v^{2}=\frac{\lambda^{2}y^{2}}{x^{2}}\Longrightarrow y^{2}=\frac{v^{2}x^{2}}{\lambda^{2}}</math>.

Now, <math>u^{2}=x^{2}+y^{2}=x^{2}+\frac{v^{2}x^{2}}{\lambda^{2}}=x^{2}\left(1+v^{2}/\lambda^{2}\right)\Longrightarrow x\left(u,v\right)=\frac{u}{\sqrt{1+v^{2}/\lambda^{2}}}</math>.

Thus, <math>y=\frac{vx}{\lambda}\Longrightarrow y\left(u,v\right)=\frac{vu}{\lambda\sqrt{1+v^{2}/\lambda^{2}}}</math>.

Computing the Jacobian:

<math>\frac{\partial\left(x,y\right)}{\partial\left(u,v\right)}=\left|\begin{array}{cc}
\frac{\partial x}{\partial u} & \frac{\partial x}{\partial v}\\
\frac{\partial y}{\partial u} & \frac{\partial y}{\partial v}
\end{array}\right|=\left|\begin{array}{cc}
\frac{1}{\sqrt{1+v^{2}/\lambda^{2}}} & \frac{-uv}{\lambda^{2}\left(1+v^{2}/\lambda^{2}\right)^{\frac{3}{2}}}\\
\frac{v}{\lambda\sqrt{1+v^{2}/\lambda^{2}}} & \frac{u}{\lambda\sqrt{1+v^{2}/\lambda^{2}}}-\frac{uv^{2}}{\lambda^{3}\left(1+v^{2}/\lambda^{2}\right)^{\frac{3}{2}}}
\end{array}\right|</math><math>=\frac{1}{\sqrt{1+v^{2}/\lambda^{2}}}\left[\frac{u}{\lambda\sqrt{1+v^{2}/\lambda^{2}}}-\frac{uv^{2}}{\lambda^{3}\left(1+v^{2}/\lambda^{2}\right)^{\frac{3}{2}}}\right]+\frac{uv^{2}}{\lambda^{3}\left(1+v^{2}/\lambda^{2}\right)^{2}}</math><math>=\frac{u}{\lambda\left(1+v^{2}/\lambda^{2}\right)}=\frac{\lambda u}{\lambda^{2}+v^{2}}\qquad\left(\geq0\text{ because }u\text{ is non-negative}\right).</math>

Because <math>\mathbf{X}</math> and <math>\mathbf{Y}</math> are statistically independent,

<math>f_{\mathbf{XY}}\left(x,y\right)=f_{\mathbf{X}}\left(x\right)f_{\mathbf{Y}}\left(y\right)=xe^{-x^{2}/2}\cdot\mathbf{1}_{\left[0,\infty\right)}\left(x\right)\cdot\frac{1}{\sqrt{2\pi}}e^{-y^{2}/2}=\frac{x}{\sqrt{2\pi}}e^{-\left(x^{2}+y^{2}\right)/2}\cdot\mathbf{1}_{\left[0,\infty\right)}\left(x\right).</math>

Substituting these quantities, we get

<math>f_{\mathbf{UV}}\left(u,v\right)=f_{\mathbf{XY}}\left(x\left(u,v\right),y\left(u,v\right)\right)\left|\frac{\partial\left(x,y\right)}{\partial\left(u,v\right)}\right|=\frac{u}{\sqrt{1+v^{2}/\lambda^{2}}}\cdot\frac{1}{\sqrt{2\pi}}e^{-u^{2}/2}\cdot\frac{\lambda u}{\lambda^{2}+v^{2}}\cdot\mathbf{1}_{\left[0,\infty\right)}\left(u\right)</math><math>=\frac{\lambda^{2}}{\sqrt{2\pi}}u^{2}e^{-u^{2}/2}\cdot\mathbf{1}_{\left[0,\infty\right)}\left(u\right)\cdot\frac{1}{\left(\lambda^{2}+v^{2}\right)^{\frac{3}{2}}}.</math>

(b) Are <math>\mathbf{U}</math> and <math>\mathbf{V}</math> statistically independent? Justify your answer.

<math>\mathbf{U}</math> and <math>\mathbf{V}</math> are statistically independent iff <math>f_{\mathbf{UV}}\left(u,v\right)=f_{\mathbf{U}}\left(u\right)f_{\mathbf{V}}\left(v\right)</math>.

Now from part (a), we see that <math>f_{\mathbf{UV}}\left(u,v\right)=c_{1}g_{1}\left(u\right)\cdot c_{2}g_{2}\left(v\right)</math> where <math>g_{1}\left(u\right)=u^{2}e^{-u^{2}/2}\cdot\mathbf{1}_{\left[0,\infty\right)}\left(u\right)</math> and <math>g_{2}\left(v\right)=\frac{1}{\left(\lambda^{2}+v^{2}\right)^{\frac{3}{2}}}</math>, with <math>c_{1}</math> and <math>c_{2}</math> selected such that <math>f_{\mathbf{U}}\left(u\right)=c_{1}g_{1}\left(u\right)</math> and <math>f_{\mathbf{V}}\left(v\right)=c_{2}g_{2}\left(v\right)</math> are both valid pdfs.

<math>\therefore\ \mathbf{U}</math> and <math>\mathbf{V}</math> are statistically independent.
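A rough empirical probe of part (b), assuming NumPy and an arbitrary <math>\lambda</math>: the pdf of <math>\mathbf{X}</math> is the standard Rayleigh density, so it can be sampled directly, and independence of <math>\mathbf{U}</math> and <math>\mathbf{V}</math> can be spot-checked by testing whether expectations of bounded functions factor. This is only a sanity check, not a proof.

<pre>
# Hypothetical independence probe for U and V (not part of the original solution).
import numpy as np

rng = np.random.default_rng(4)
lam, trials = 1.5, 1_000_000

X = rng.rayleigh(1.0, trials)            # pdf x*exp(-x^2/2) on [0, inf)
Y = rng.normal(0.0, 1.0, trials)
U = np.hypot(X, Y)
V = lam * Y / X

# If U and V are independent, E[g(U)h(V)] = E[g(U)] E[h(V)] for bounded g, h.
g, h = np.tanh(U), np.tanh(V)
print(np.mean(g * h), g.mean() * h.mean())   # the two values should be close
</pre>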
=Example. Two jointly distributed random variables (Joint characteristic function)=

Let <math>\mathbf{X}</math> and <math>\mathbf{Y}</math> be two jointly distributed random variables having joint characteristic function

<math>\Phi_{\mathbf{XY}}\left(\omega_{1},\omega_{2}\right)=\frac{1}{\left(1-i\omega_{1}\right)\left(1-i\omega_{2}\right)}.</math>

(a) Calculate <math>E\left[\mathbf{X}\right]</math>.

<math>\Phi_{\mathbf{X}}\left(\omega\right)=\Phi_{\mathbf{XY}}\left(\omega,0\right)=\frac{1}{1-i\omega}=\left(1-i\omega\right)^{-1}</math>

<math>E\left[\mathbf{X}\right]=\frac{d}{d\left(i\omega\right)}\Phi_{\mathbf{X}}\left(\omega\right)|_{i\omega=0}=(-1)(1-i\omega)^{-2}(-1)|_{i\omega=0}=1</math>

(b) Calculate <math>E\left[\mathbf{Y}\right]</math>.

By symmetry, <math>\Phi_{\mathbf{Y}}\left(\omega\right)=\Phi_{\mathbf{XY}}\left(0,\omega\right)=\left(1-i\omega\right)^{-1}</math>, so <math>E\left[\mathbf{Y}\right]=1</math>.

(c) Calculate <math>E\left[\mathbf{XY}\right]</math>.

<math>E\left[\mathbf{XY}\right]=\frac{\partial^{2}}{\partial\left(i\omega_{1}\right)\partial\left(i\omega_{2}\right)}\Phi_{\mathbf{XY}}\left(\omega_{1},\omega_{2}\right)|_{i\omega_{1}=i\omega_{2}=0}=\left(1-i\omega_{1}\right)^{-2}\left(1-i\omega_{2}\right)^{-2}|_{i\omega_{1}=i\omega_{2}=0}=1</math>

(d) Calculate <math>E\left[\mathbf{X}^{j}\mathbf{Y}^{k}\right]</math>.

<math>E\left[\mathbf{X}^{j}\mathbf{Y}^{k}\right]=\frac{\partial^{j+k}}{\partial\left(i\omega_{1}\right)^{j}\partial\left(i\omega_{2}\right)^{k}}\Phi_{\mathbf{XY}}\left(\omega_{1},\omega_{2}\right)|_{i\omega_{1}=i\omega_{2}=0}=\frac{\partial^{j+k}}{\partial\left(i\omega_{1}\right)^{j}\partial\left(i\omega_{2}\right)^{k}}\left[\left(1-i\omega_{1}\right)^{-1}\left(1-i\omega_{2}\right)^{-1}\right]|_{i\omega_{1}=i\omega_{2}=0}</math><math>=j!\left(1-i\omega_{1}\right)^{-\left(j+1\right)}k!\left(1-i\omega_{2}\right)^{-\left(k+1\right)}|_{i\omega_{1}=i\omega_{2}=0}=j!k!</math>

(e) Calculate the correlation coefficient <math>r_{\mathbf{XY}}</math> between <math>\mathbf{X}</math> and <math>\mathbf{Y}</math>.

<math>r_{\mathbf{XY}}=\frac{Cov\left(\mathbf{X},\mathbf{Y}\right)}{\sigma_{\mathbf{X}}\sigma_{\mathbf{Y}}}=\frac{E\left[\mathbf{XY}\right]-E\left[\mathbf{X}\right]E\left[\mathbf{Y}\right]}{\sigma_{\mathbf{X}}\sigma_{\mathbf{Y}}}=\frac{1-1\cdot1}{\sigma_{\mathbf{X}}\sigma_{\mathbf{Y}}}=0.</math>
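Since <math>\Phi_{\mathbf{XY}}</math> factors as <math>\left(1-i\omega_{1}\right)^{-1}\left(1-i\omega_{2}\right)^{-1}</math>, <math>\mathbf{X}</math> and <math>\mathbf{Y}</math> are independent exponential random variables with mean 1, so the moments can also be spot-checked numerically. The sketch below assumes NumPy; the sample size and the <math>(j,k)</math> pairs are arbitrary choices.

<pre>
# Hypothetical moment check: 1/(1 - i*w) is the characteristic function of Exp(1).
import math
import numpy as np

rng = np.random.default_rng(5)
X = rng.exponential(1.0, 2_000_000)
Y = rng.exponential(1.0, 2_000_000)

for j, k in [(1, 1), (2, 1), (2, 2)]:
    print(np.mean(X**j * Y**k), math.factorial(j) * math.factorial(k))
</pre>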
=Example. Geometric random variable=

Let <math>\mathbf{X}</math> be a random variable with probability mass function

<math>p_{\mathbf{X}}\left(k\right)=\alpha\left(1-\alpha\right)^{k-1},\quad k=1,2,3,\cdots</math>

where <math>0<\alpha<1</math>.

Note: this is a geometric random variable with success probability <math>\alpha</math>.

(a) Find the characteristic function of <math>\mathbf{X}</math>.

<math>\Phi_{\mathbf{X}}\left(\omega\right)=E\left[e^{i\omega\mathbf{X}}\right]=\sum_{k=1}^{\infty}e^{i\omega k}\alpha\left(1-\alpha\right)^{k-1}=\alpha e^{i\omega}\sum_{k=1}^{\infty}\left[e^{i\omega}\left(1-\alpha\right)\right]^{k-1}</math><math>=\alpha e^{i\omega}\sum_{m=0}^{\infty}\left[e^{i\omega}\left(1-\alpha\right)\right]^{m}=\frac{\alpha e^{i\omega}}{1-e^{i\omega}\left(1-\alpha\right)}\text{ (infinite geometric series)},</math>

where the geometric series converges since <math>\left|e^{i\omega}\left(1-\alpha\right)\right|=\left|e^{i\omega}\right|\left(1-\alpha\right)=1-\alpha<1</math>.

(b) Find the mean of <math>\mathbf{X}</math>.

<math>E\left[\mathbf{X}\right]=\frac{d}{d\left(i\omega\right)}\Phi_{\mathbf{X}}\left(\omega\right)\Bigl|_{i\omega=0}=\frac{d}{d\left(i\omega\right)}\alpha e^{i\omega}\left(1-e^{i\omega}\left(1-\alpha\right)\right)^{-1}\Bigl|_{i\omega=0}</math><math>=\left[\alpha e^{i\omega}\left(1-e^{i\omega}\left(1-\alpha\right)\right)^{-1}+\alpha e^{i\omega}\cdot\left(-1\right)\left(1-e^{i\omega}\left(1-\alpha\right)\right)^{-2}\cdot\left(-e^{i\omega}\left(1-\alpha\right)\right)\right]\Bigl|_{i\omega=0}</math><math>=\left[\alpha e^{i\omega}\left(1-e^{i\omega}\left(1-\alpha\right)\right)^{-1}+\alpha\left(1-\alpha\right)e^{2i\omega}\left(1-e^{i\omega}\left(1-\alpha\right)\right)^{-2}\right]\Bigl|_{i\omega=0}</math><math>=\alpha\left(1-\left(1-\alpha\right)\right)^{-1}+\alpha\left(1-\alpha\right)\left(1-\left(1-\alpha\right)\right)^{-2}=1+\frac{\left(1-\alpha\right)}{\alpha}=\frac{1}{\alpha}.</math>

Note: an alternative approach to finding <math>E\left[\mathbf{X}\right]</math> and <math>Var\left[\mathbf{X}\right]</math> is given in [CS1GeometricDistribution].

(c) Find the variance of <math>\mathbf{X}</math>.

<math>E\left[\mathbf{X}^{2}\right]=\frac{d^{2}}{d\left(i\omega\right)^{2}}\Phi_{\mathbf{X}}\left(\omega\right)\Bigl|_{i\omega=0}=\frac{d^{2}}{d\left(i\omega\right)^{2}}\alpha e^{i\omega}\left(1-e^{i\omega}\left(1-\alpha\right)\right)^{-1}\Bigl|_{i\omega=0}</math><math>=\frac{d}{d\left(i\omega\right)}\left[\alpha e^{i\omega}\left(1-e^{i\omega}\left(1-\alpha\right)\right)^{-1}+\alpha\left(1-\alpha\right)e^{2i\omega}\left(1-e^{i\omega}\left(1-\alpha\right)\right)^{-2}\right]\Bigl|_{i\omega=0}</math><math>=\frac{d}{d\left(i\omega\right)}\alpha e^{i\omega}\left(1-e^{i\omega}\left(1-\alpha\right)\right)^{-1}\Bigl|_{i\omega=0}+\alpha\left(1-\alpha\right)\frac{d}{d\left(i\omega\right)}e^{2i\omega}\left(1-e^{i\omega}\left(1-\alpha\right)\right)^{-2}\Bigl|_{i\omega=0}</math><math>=\frac{1}{\alpha}+\alpha\left(1-\alpha\right)\frac{2}{\alpha^{3}}=\frac{\alpha}{\alpha^{2}}+\frac{2-2\alpha}{\alpha^{2}}=\frac{2-\alpha}{\alpha^{2}}</math>

because

<math>\frac{d}{d\left(i\omega\right)}e^{2i\omega}\left(1-e^{i\omega}\left(1-\alpha\right)\right)^{-2}\Bigl|_{i\omega=0}=\left[2e^{2i\omega}\left(1-e^{i\omega}\left(1-\alpha\right)\right)^{-2}+e^{2i\omega}\left(-2\right)\left(1-e^{i\omega}\left(1-\alpha\right)\right)^{-3}\left(-e^{i\omega}\left(1-\alpha\right)\right)\right]\Bigl|_{i\omega=0}</math><math>=\left[2e^{2i\omega}\left(1-e^{i\omega}\left(1-\alpha\right)\right)^{-2}+2\left(1-\alpha\right)e^{3i\omega}\left(1-e^{i\omega}\left(1-\alpha\right)\right)^{-3}\right]\Bigl|_{i\omega=0}</math><math>=2\alpha^{-2}+2\left(1-\alpha\right)\alpha^{-3}=\frac{2\alpha+2-2\alpha}{\alpha^{3}}=\frac{2}{\alpha^{3}}.</math>

<math>Var\left[\mathbf{X}\right]=E\left[\mathbf{X}^{2}\right]-\left(E\left[\mathbf{X}\right]\right)^{2}=\frac{2-\alpha}{\alpha^{2}}-\frac{1}{\alpha^{2}}=\frac{1-\alpha}{\alpha^{2}}.</math>
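A quick numerical check of the mean <math>\frac{1}{\alpha}</math> and variance <math>\frac{1-\alpha}{\alpha^{2}}</math>, assuming NumPy; the value of <math>\alpha</math> is an arbitrary choice.

<pre>
# Hypothetical check of the geometric mean and variance formulas.
import numpy as np

rng = np.random.default_rng(6)
alpha = 0.3
X = rng.geometric(alpha, 1_000_000)      # support k = 1, 2, 3, ...

print(X.mean(), 1 / alpha)               # mean: 1/alpha
print(X.var(), (1 - alpha) / alpha**2)   # variance: (1-alpha)/alpha^2
</pre>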
=Example. Sequence of binomially distributed random variables=

Let <math>\left\{ \mathbf{X}_{n}\right\} _{n\geq1}</math> be a sequence of binomially distributed random variables, with the <math>n</math>-th random variable <math>\mathbf{X}_{n}</math> having pmf

<math>P_{\mathbf{X}_{n}}(k)=P\left(\left\{ \mathbf{X}_{n}=k\right\} \right)=\left(\begin{array}{c}
n\\
k
\end{array}\right)p_{n}^{k}\left(1-p_{n}\right)^{n-k}\;,\; k=0,1,\cdots,n,\; p_{n}\in\left(0,1\right).</math>

Show that, if the <math>p_{n}</math> have the property that <math>np_{n}\rightarrow\lambda</math> as <math>n\rightarrow\infty</math>, where <math>\lambda</math> is a positive constant, then the sequence <math>\left\{ \mathbf{X}_{n}\right\} _{n\geq1}</math> converges in distribution to a Poisson random variable <math>\mathbf{X}</math> with mean <math>\lambda</math>.

Hint: you may find the following fact useful:

<math>\lim_{n\rightarrow\infty}\left(1+\frac{x}{n}\right)^{n}=e^{x}.</math>

Solution

If <math>\mathbf{X}_{n}</math> converges to <math>\mathbf{X}</math> in distribution, then <math>F_{\mathbf{X}_{n}}(x)\rightarrow F_{\mathbf{X}}(x)</math> at every <math>x\in\mathbf{R}</math> where <math>F_{\mathbf{X}}(x)</math> is continuous. This occurs iff <math>\Phi_{\mathbf{X}_{n}}(\omega)\rightarrow\Phi_{\mathbf{X}}(\omega)</math> <math>\forall\omega\in\mathbf{R}</math>. We will show that <math>\Phi_{\mathbf{X}_{n}}(\omega)</math> converges to <math>e^{-\lambda\left(1-e^{i\omega}\right)}</math> as <math>n\rightarrow\infty</math>, which is the characteristic function of a Poisson random variable with mean <math>\lambda</math>.

<math>\Phi_{\mathbf{X}_{n}}(\omega)=E\left[e^{i\omega\mathbf{X}_{n}}\right]=\sum_{k=0}^{n}e^{i\omega k}\left(\begin{array}{c}
n\\
k
\end{array}\right)p_{n}^{k}\left(1-p_{n}\right)^{n-k}=\sum_{k=0}^{n}\left(\begin{array}{c}
n\\
k
\end{array}\right)\left(p_{n}e^{i\omega}\right)^{k}\left(1-p_{n}\right)^{n-k}</math><math>=\left(p_{n}e^{i\omega}+1-p_{n}\right)^{n}=\left(1+p_{n}\left(e^{i\omega}-1\right)\right)^{n}.</math>

Now as <math>n\rightarrow\infty</math>, <math>np_{n}\rightarrow\lambda</math>, so <math>p_{n}\approx\frac{\lambda}{n}</math> for large <math>n</math>. Therefore,

<math>\lim_{n\rightarrow\infty}\Phi_{\mathbf{X}_{n}}(\omega)=\lim_{n\rightarrow\infty}\left(1+p_{n}\left(e^{i\omega}-1\right)\right)^{n}=\lim_{n\rightarrow\infty}\left(1+\frac{\lambda}{n}\left(e^{i\omega}-1\right)\right)^{n}=e^{\lambda\left(e^{i\omega}-1\right)}=e^{-\lambda\left(1-e^{i\omega}\right)},</math>

which is the characteristic function of a Poisson random variable with mean <math>\lambda</math>.

Cf. Problem 2 of the August 2007 QE [CS1QE2007August], which is identical to this example.
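A small numeric illustration of the convergence, using only the Python standard library: compare the Binomial(<math>n</math>, <math>\lambda/n</math>) pmf with the Poisson(<math>\lambda</math>) pmf for a moderately large <math>n</math>. The values of <math>\lambda</math> and <math>n</math> are arbitrary choices.

<pre>
# Hypothetical pmf comparison: Binomial(n, lam/n) vs. Poisson(lam).
import math

lam, n = 3.0, 1000
p = lam / n

for k in range(6):
    binom = math.comb(n, k) * p**k * (1 - p)**(n - k)
    poisson = math.exp(-lam) * lam**k / math.factorial(k)
    print(k, round(binom, 6), round(poisson, 6))   # the columns should nearly agree
</pre>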
=Example. Sequence of exponentially distributed random variables=

Let <math>\mathbf{X}_{1},\mathbf{X}_{2},\cdots,\mathbf{X}_{n}</math> be a collection of i.i.d. exponentially distributed random variables, each having mean <math>\mu</math>. Define

<math>\mathbf{Y}=\max\left\{ \mathbf{X}_{1},\mathbf{X}_{2},\cdots,\mathbf{X}_{n}\right\}</math>

and

<math>\mathbf{Z}=\min\left\{ \mathbf{X}_{1},\mathbf{X}_{2},\cdots,\mathbf{X}_{n}\right\}</math>.

(a) Find the pdf of <math>\mathbf{Y}</math>.

<math>F_{\mathbf{Y}}\left(y\right)=P\left(\left\{ \mathbf{Y}\leq y\right\} \right)=P\left(\left\{ \max\left\{ \mathbf{X}_{1},\mathbf{X}_{2},\cdots,\mathbf{X}_{n}\right\} \leq y\right\} \right)=P\left(\left\{ \mathbf{X}_{1}\leq y\right\} \cap\left\{ \mathbf{X}_{2}\leq y\right\} \cap\cdots\cap\left\{ \mathbf{X}_{n}\leq y\right\} \right)</math><math>=P\left(\left\{ \mathbf{X}_{1}\leq y\right\} \right)P\left(\left\{ \mathbf{X}_{2}\leq y\right\} \right)\cdots P\left(\left\{ \mathbf{X}_{n}\leq y\right\} \right)=\left(F_{\mathbf{X}}\left(y\right)\right)^{n}=\left(1-e^{-y/\mu}\right)^{n}\cdot\mathbf{1}_{\left[0,\infty\right)}\left(y\right)</math>

<math>f_{\mathbf{Y}}(y)=\frac{dF_{\mathbf{Y}}(y)}{dy}=n\left(1-e^{-y/\mu}\right)^{n-1}\cdot\frac{1}{\mu}e^{-y/\mu}\cdot\mathbf{1}_{\left[0,\infty\right)}\left(y\right)=\frac{n}{\mu}e^{-y/\mu}\left(1-e^{-y/\mu}\right)^{n-1}\cdot\mathbf{1}_{\left[0,\infty\right)}\left(y\right)</math>

(b) Find the pdf of <math>\mathbf{Z}</math>.

<math>F_{\mathbf{Z}}(z)=P\left(\left\{ \mathbf{Z}\leq z\right\} \right)=1-P\left(\left\{ \mathbf{Z}>z\right\} \right)=1-P\left(\left\{ \min\left\{ \mathbf{X}_{1},\mathbf{X}_{2},\cdots,\mathbf{X}_{n}\right\} >z\right\} \right)</math><math>=1-P\left(\left\{ \mathbf{X}_{1}>z\right\} \cap\left\{ \mathbf{X}_{2}>z\right\} \cap\cdots\cap\left\{ \mathbf{X}_{n}>z\right\} \right)=1-\left(1-F_{\mathbf{X}}(z)\right)^{n}</math><math>=\left[1-\left(1-\left(1-e^{-z/\mu}\right)\right)^{n}\right]\cdot\mathbf{1}_{\left[0,\infty\right)}\left(z\right)=\left(1-e^{-nz/\mu}\right)\cdot\mathbf{1}_{\left[0,\infty\right)}\left(z\right)</math>

<math>f_{\mathbf{Z}}(z)=\frac{dF_{\mathbf{Z}}(z)}{dz}=\frac{n}{\mu}e^{-nz/\mu}\cdot\mathbf{1}_{\left[0,\infty\right)}\left(z\right)</math>

(c) In words, give as complete a description of the random variable <math>\mathbf{Z}</math> as you can.

<math>\mathbf{Z}</math> is an exponentially distributed random variable with mean <math>\frac{\mu}{n}</math>.
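A brief simulation of part (c), assuming NumPy; <math>\mu</math>, <math>n</math>, and the sample size are arbitrary picks.

<pre>
# Hypothetical check: the minimum of n iid exponentials (mean mu) is
# exponential with mean mu/n.
import numpy as np

rng = np.random.default_rng(7)
mu, n, trials = 2.0, 5, 500_000

X = rng.exponential(mu, (trials, n))
Z = X.min(axis=1)

print(Z.mean(), mu / n)        # mean of the minimum
print(Z.var(), (mu / n)**2)    # exponential: variance = mean^2
</pre>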
=Example. Sequence of uniformly distributed random variables=

Let <math>\mathbf{X}_{1},\mathbf{X}_{2},\cdots,\mathbf{X}_{n}</math> be <math>n</math> i.i.d. jointly distributed random variables, each uniformly distributed on the interval <math>\left[0,1\right]</math>. Define the new random variables <math>\mathbf{W}=\max\left\{ \mathbf{X}_{1},\mathbf{X}_{2},\cdots,\mathbf{X}_{n}\right\}</math>

and

<math>\mathbf{Z}=\min\left\{ \mathbf{X}_{1},\mathbf{X}_{2},\cdots,\mathbf{X}_{n}\right\}</math>.

(a) Find the pdf of <math>\mathbf{W}</math>.

<math>F_{\mathbf{W}}(w)=P\left(\left\{ \mathbf{W}\leq w\right\} \right)=P\left(\left\{ \max\left\{ \mathbf{X}_{1},\mathbf{X}_{2},\cdots,\mathbf{X}_{n}\right\} \leq w\right\} \right)=P\left(\left\{ \mathbf{X}_{1}\leq w\right\} \cap\left\{ \mathbf{X}_{2}\leq w\right\} \cap\cdots\cap\left\{ \mathbf{X}_{n}\leq w\right\} \right)</math><math>=P\left(\left\{ \mathbf{X}_{1}\leq w\right\} \right)P\left(\left\{ \mathbf{X}_{2}\leq w\right\} \right)\cdots P\left(\left\{ \mathbf{X}_{n}\leq w\right\} \right)=\left(F_{\mathbf{X}}\left(w\right)\right)^{n},</math>

where <math>f_{\mathbf{X}}(x)=\mathbf{1}_{\left[0,1\right]}(x)</math> and <math>F_{\mathbf{X}}\left(x\right)=\begin{cases}
0, & x<0\\
x, & 0\leq x<1\\
1, & x\geq1
\end{cases}</math>.

<math>f_{\mathbf{W}}\left(w\right)=\frac{dF_{\mathbf{W}}\left(w\right)}{dw}=n\left[F_{\mathbf{X}}\left(w\right)\right]^{n-1}\cdot f_{\mathbf{X}}\left(w\right)=n\cdot w^{n-1}\cdot\mathbf{1}_{\left[0,1\right]}(w).</math>

(b) Find the pdf of <math>\mathbf{Z}</math>.

<math>F_{\mathbf{Z}}(z)=P\left(\left\{ \mathbf{Z}\leq z\right\} \right)=1-P\left(\left\{ \mathbf{Z}>z\right\} \right)=1-P\left(\left\{ \min\left\{ \mathbf{X}_{1},\mathbf{X}_{2},\cdots,\mathbf{X}_{n}\right\} >z\right\} \right)</math><math>=1-P\left(\left\{ \mathbf{X}_{1}>z\right\} \cap\left\{ \mathbf{X}_{2}>z\right\} \cap\cdots\cap\left\{ \mathbf{X}_{n}>z\right\} \right)=1-\left(1-F_{\mathbf{X}}(z)\right)^{n}.</math>

<math>f_{\mathbf{Z}}(z)=\frac{dF_{\mathbf{Z}}(z)}{dz}=n\left(1-F_{\mathbf{X}}(z)\right)^{n-1}f_{\mathbf{X}}(z)=n\left(1-z\right)^{n-1}\mathbf{1}_{\left[0,1\right]}\left(z\right).</math>

(c) Find the mean of <math>\mathbf{W}</math>.

<math>E\left[\mathbf{W}\right]=\int_{-\infty}^{\infty}wf_{\mathbf{W}}(w)dw=\int_{0}^{1}nw^{n}dw=\frac{n}{n+1}w^{n+1}\Bigl|_{0}^{1}=\frac{n}{n+1}.</math>
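A one-line simulation of part (c), assuming NumPy; <math>n</math> and the sample size are arbitrary.

<pre>
# Hypothetical check: for n iid Uniform[0,1] variables, E[max] = n/(n+1).
import numpy as np

rng = np.random.default_rng(8)
n, trials = 4, 500_000

W = rng.random((trials, n)).max(axis=1)
print(W.mean(), n / (n + 1))
</pre>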
=Example. Mean of i.i.d.  random variables=

Let <math>\mathbf{X}_{1},\mathbf{X}_{2},\cdots,\mathbf{X}_{M}</math> be <math>M</math> jointly distributed i.i.d. random variables with mean <math>\mu</math> and variance <math>\sigma^{2}</math>. Let <math>\mathbf{Y}_{M}=\frac{1}{M}\sum_{n=1}^{M}\mathbf{X}_{n}</math>.

(a) Find the variance of <math>\mathbf{Y}_{M}</math>.

<math>Var\left[\mathbf{Y}_{M}\right]=E\left[\mathbf{Y}_{M}^{2}\right]-\left(E\left[\mathbf{Y}_{M}\right]\right)^{2}.</math>

<math>E\left[\mathbf{Y}_{M}\right]=E\left[\frac{1}{M}\sum_{n=1}^{M}\mathbf{X}_{n}\right]=\frac{1}{M}\sum_{n=1}^{M}E\left[\mathbf{X}_{n}\right]=\frac{1}{M}\cdot M\cdot\mu=\mu.</math>

<math>E\left[\mathbf{Y}_{M}^{2}\right]=E\left[\frac{1}{M^{2}}\sum_{m=1}^{M}\sum_{n=1}^{M}\mathbf{X}_{m}\mathbf{X}_{n}\right]=\frac{1}{M^{2}}\sum_{m=1}^{M}\sum_{n=1}^{M}E\left[\mathbf{X}_{m}\mathbf{X}_{n}\right].</math>

Now <math>E\left[\mathbf{X}_{m}\mathbf{X}_{n}\right]=\begin{cases}
E\left[\mathbf{X}_{m}^{2}\right], & m=n\\
E\left[\mathbf{X}_{m}\right]E\left[\mathbf{X}_{n}\right], & m\neq n
\end{cases}</math> because when <math>m\neq n</math>, <math>\mathbf{X}_{m}</math> and <math>\mathbf{X}_{n}</math> are independent <math>\Rightarrow\ \mathbf{X}_{m}</math> and <math>\mathbf{X}_{n}</math> are uncorrelated.

<math>E\left[\mathbf{Y}_{M}^{2}\right]=\frac{1}{M^{2}}\left[M\left(\mu^{2}+\sigma^{2}\right)+M\left(M-1\right)\mu^{2}\right]=\frac{\left(\mu^{2}+\sigma^{2}\right)+\left(M-1\right)\mu^{2}}{M}=\frac{M\mu^{2}+\sigma^{2}}{M}.</math>

<math>Var\left[\mathbf{Y}_{M}\right]=\frac{M\mu^{2}+\sigma^{2}-M\mu^{2}}{M}=\frac{\sigma^{2}}{M}.</math>

(b) Now assume that <math>\mathbf{X}_{1},\mathbf{X}_{2},\cdots,\mathbf{X}_{M}</math> are identically distributed with mean <math>\mu</math> and variance <math>\sigma^{2}</math>, but they are only uncorrelated rather than independent. Find the variance of <math>\mathbf{Y}_{M}</math>.

Again, <math>Var\left[\mathbf{Y}_{M}\right]=\frac{\sigma^{2}}{M}</math>, because only uncorrelatedness was used in part (a).
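A quick simulation of part (a), assuming NumPy; the distribution (Gaussian) and the parameter values are illustrative choices, since the result holds for any uncorrelated variables with common mean and variance.

<pre>
# Hypothetical check: the sample mean of M iid variables has variance sigma^2/M.
import numpy as np

rng = np.random.default_rng(9)
M, trials, mu, sigma = 10, 200_000, 1.0, 2.0

X = rng.normal(mu, sigma, (trials, M))
Y = X.mean(axis=1)

print(Y.mean(), mu)             # E[Y_M] = mu
print(Y.var(), sigma**2 / M)    # Var[Y_M] = sigma^2 / M
</pre>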
=Example. A sum of a random number of i.i.d.  Gaussians=

Let <math>\left\{ \mathbf{X}_{n}\right\}</math> be a sequence of i.i.d. Gaussian random variables, each having characteristic function

<math>\Phi_{\mathbf{X}}\left(\omega\right)=e^{i\mu\omega}e^{-\frac{1}{2}\sigma^{2}\omega^{2}}.</math>

Let <math>\mathbf{N}</math> be a Poisson random variable with pmf

<math>p(n)=\frac{e^{-\lambda}\lambda^{n}}{n!},\; n=0,1,2,\cdots,\;\lambda>0,</math>

and assume <math>\mathbf{N}</math> is statistically independent of <math>\left\{ \mathbf{X}_{n}\right\}</math>. Define a new random variable

<math>\mathbf{Y}=\mathbf{X}_{1}+\mathbf{X}_{2}+\cdots+\mathbf{X}_{\mathbf{N}}.</math>

Note: if <math>\mathbf{N}=0</math>, then <math>\mathbf{Y}=0</math>.

(a) Find the mean of <math>\mathbf{Y}</math>.

*The probability generating function of <math>\mathbf{N}</math> is <math>P_{\mathbf{N}}\left(z\right)=E\left[z^{\mathbf{N}}\right]=\sum_{n=0}^{\infty}z^{n}\frac{e^{-\lambda}\lambda^{n}}{n!}=e^{-\lambda}\sum_{n=0}^{\infty}\frac{\left(z\lambda\right)^{n}}{n!}=e^{-\lambda}e^{z\lambda}=e^{-\lambda\left(1-z\right)}.</math>

*The characteristic function of <math>\mathbf{Y}</math> is <math>\Phi_{\mathbf{Y}}\left(\omega\right)=P_{\mathbf{N}}\left(z\right)\Bigl|_{z=\Phi_{\mathbf{X}}\left(\omega\right)}=e^{-\lambda\left(1-z\right)}\Bigl|_{z=e^{i\mu\omega}e^{-\frac{1}{2}\sigma^{2}\omega^{2}}}=e^{-\lambda\left(1-e^{i\mu\omega}e^{-\frac{1}{2}\sigma^{2}\omega^{2}}\right)}.</math>

*Now, we can get the mean of <math>\mathbf{Y}</math> using the characteristic function: <math>E\left[\mathbf{Y}\right]=\frac{d}{d\left(i\omega\right)}\Phi_{\mathbf{Y}}\left(\omega\right)\Bigl|_{i\omega=0}=e^{-\lambda}\cdot\frac{d}{d\left(i\omega\right)}e^{\lambda e^{\mu\left(i\omega\right)+\frac{1}{2}\sigma^{2}\left(i\omega\right)^{2}}}\Bigl|_{i\omega=0}</math><math>=e^{-\lambda}\cdot e^{\lambda e^{\mu\left(i\omega\right)+\frac{1}{2}\sigma^{2}\left(i\omega\right)^{2}}}\cdot\lambda e^{\mu\left(i\omega\right)+\frac{1}{2}\sigma^{2}\left(i\omega\right)^{2}}\cdot\left(\mu+\sigma^{2}\left(i\omega\right)\right)\Bigl|_{i\omega=0}</math><math>=e^{-\lambda}\cdot e^{\lambda}\cdot\lambda\cdot\mu=\lambda\mu.</math>

This agrees with Wald's identity: <math>E\left[\mathbf{Y}\right]=E\left[\mathbf{N}\right]E\left[\mathbf{X}\right]=\lambda\mu</math>.
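A short simulation of <math>E\left[\mathbf{Y}\right]=\lambda\mu</math>, assuming NumPy; the parameter values are arbitrary. It uses the fact that the sum of <math>n</math> i.i.d. <math>\mathcal{N}\left(\mu,\sigma^{2}\right)</math> variables is <math>\mathcal{N}\left(n\mu,n\sigma^{2}\right)</math>, so each random sum can be drawn in one shot.

<pre>
# Hypothetical check of E[Y] = lambda * mu for a Poisson-random sum of Gaussians.
import numpy as np

rng = np.random.default_rng(10)
lam, mu, sigma, trials = 4.0, 1.5, 2.0, 200_000

N = rng.poisson(lam, trials)
Z = rng.normal(0.0, 1.0, trials)
Y = mu * N + sigma * np.sqrt(N) * Z   # equal in distribution to the sum of N Gaussians

print(Y.mean(), lam * mu)
</pre>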