

ECE Ph.D. Qualifying Exam

Communication, Networking, Signal and Image Processing (CS)

Question 1: Probability and Random Processes

August 2011



Part 3

Jump to Part 1,2,4


Show that the sum of two jointly distributed Gaussian random variables that are not necessarily statistically independent is a Gaussian random variable.


Solution 1:

Suppose $ \mathbf{X} $ and $ \mathbf{Y} $ are jointly distributed Gaussian random variables with joint pdf

$ f_{X,Y}(x,y)=\frac{1}{2\pi\sigma_X\sigma_Y\sqrt{1-r^2}}\text{exp}\left\{\frac{-1}{2(1-r^2)}\left(\frac{(x-\eta_X)^2}{\sigma_X^2}-2r\frac{(x-\eta_X)(y-\eta_Y)}{\sigma_X\sigma_Y}+\frac{(y-\eta_Y)^2}{\sigma_Y^2}\right)\right\} $

Then, compute the characteristic function of $ \mathbf{X}+\mathbf{Y} $:

$ \begin{align} \Phi_{X+Y}(\omega) &= E[e^{i\omega(X+Y)}]\\ &=\int_{-\infty}^\infty\int_{-\infty}^\infty e^{i\omega(x+y)}f_{X,Y}(x,y)dxdy \\ &=\int_{-\infty}^\infty e^{i\omega y}\left(\int_{-\infty}^\infty e^{i\omega x}f_{X,Y}(x,y)dx\right)dy \\ &=\int_{-\infty}^\infty e^{i\omega y}\left(\int_{-\infty}^\infty e^{i\omega x} \frac{1}{2\pi\sigma_X\sigma_Y\sqrt{1-r^2}}\text{exp}\left\{ \frac{-1}{2(1-r^2)}\left(\frac{(x-\eta_X)^2}{\sigma_X^2}-2r\frac{(x-\eta_X)(y-\eta_Y)}{\sigma_X\sigma_Y}+\frac{(y-\eta_Y)^2}{\sigma_Y^2}\right)\right\}dx\right)dy \\ &=\int_{-\infty}^\infty \frac{e^{i\omega y}}{\sqrt{2\pi}\sigma_Y}\left(\int_{-\infty}^\infty e^{i\omega x} \frac{1}{\sqrt{2\pi}\sigma_X\sqrt{1-r^2}}\text{exp}\left\{-\frac{(x-(\eta_X+\frac{\sigma_X}{\sigma_Y}r(y-\eta_Y)))^2}{2(1-r^2)\sigma_X^2}\right\}dx\right)\text{exp}\left\{-\frac{(y-\eta_Y)^2}{2\sigma_Y^2}\right\}dy \end{align} $
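The last equality uses the standard factorization of the bivariate Gaussian pdf, $ f_{X,Y}(x,y)=f_{X|Y}(x|y)f_Y(y) $, where $ X|Y=y $ is Gaussian with mean $ \eta_X+\frac{\sigma_X}{\sigma_Y}r(y-\eta_Y) $ and variance $ (1-r^2)\sigma_X^2 $. As a quick numerical sanity check of that factorization (a sketch with arbitrary parameter values, not part of the original solution, assuming NumPy and SciPy are available), one can evaluate both sides at a few points:

# Sanity check: f_{X,Y}(x,y) = f_{X|Y}(x|y) f_Y(y) for the bivariate normal,
# where X | Y=y is Gaussian with mean eta_X + (sigma_X/sigma_Y) r (y - eta_Y)
# and variance (1 - r^2) sigma_X^2.  All parameter values are arbitrary.
import numpy as np
from scipy.stats import multivariate_normal, norm

eta_X, eta_Y = 1.0, -2.0
sigma_X, sigma_Y, r = 1.5, 0.8, 0.6

cov = [[sigma_X**2, r * sigma_X * sigma_Y],
       [r * sigma_X * sigma_Y, sigma_Y**2]]
joint = multivariate_normal(mean=[eta_X, eta_Y], cov=cov)

rng = np.random.default_rng(0)
for x, y in rng.normal(size=(5, 2)) * 3.0:   # a few arbitrary test points
    lhs = joint.pdf([x, y])
    cond_mean = eta_X + (sigma_X / sigma_Y) * r * (y - eta_Y)
    cond_std = sigma_X * np.sqrt(1 - r**2)
    rhs = norm.pdf(x, cond_mean, cond_std) * norm.pdf(y, eta_Y, sigma_Y)
    assert np.isclose(lhs, rhs), (lhs, rhs)
print("factorization verified at the test points")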

We know that the characteristic function of a Gaussian random variable with mean $ \mu $ and variance $ \sigma^2 $ is $ e^{i\omega\mu-\frac{1}{2}\sigma^2\omega^2} $. The inner integral above is exactly such a characteristic function, for a Gaussian with mean $ \eta_X+\frac{\sigma_X}{\sigma_Y}r(y-\eta_Y) $ and variance $ (1-r^2)\sigma_X^2 $. Then,

$ \begin{align} \Phi_{X+Y}(\omega) &= \int_{-\infty}^\infty \dfrac{e^{i\omega y}}{\sqrt{2\pi}\sigma_Y}\left(e^{i\omega(\eta_X+\frac{\sigma_X}{\sigma_Y}r(y-\eta_Y))-\frac{1}{2}(1-r^2)\sigma_X^2\omega^2}\right)\text{exp}\left\{-\dfrac{(y-\eta_Y)^2}{2\sigma_Y^2}\right\}dy \\ &= e^{i\omega(\eta_X-\frac{\sigma_X}{\sigma_Y}r\eta_Y)-\frac{1}{2}(1-r^2)\sigma_X^2\omega^2}\int_{-\infty}^\infty e^{i\omega(1+\frac{\sigma_X}{\sigma_Y}r)y}\cdot\dfrac{1}{\sqrt{2\pi}\sigma_Y}\text{exp}\left\{-\dfrac{(y-\eta_Y)^2}{2\sigma_Y^2}\right\}dy \\ &= e^{i\omega(\eta_X-\frac{\sigma_X}{\sigma_Y}r\eta_Y)-\frac{1}{2}(1-r^2)\sigma_X^2\omega^2}\cdot e^{i\omega(1+\frac{\sigma_X}{\sigma_Y}r)\eta_Y-\frac{1}{2}\sigma_Y^2\omega^2(1+\frac{\sigma_X}{\sigma_Y}r)^2} \\ &= e^{i\omega(\eta_X+\eta_Y)-\frac{1}{2}\omega^2(\sigma_X^2+2\sigma_X\sigma_Yr+\sigma_Y^2)} \end{align} $

So, $ \mathbf{X}+\mathbf{Y} $ is a Gaussian random variable with mean $ \eta_X+\eta_Y $ and variance $ \sigma_X^2+2r\sigma_X\sigma_Y+\sigma_Y^2 $.
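A Monte Carlo check of this conclusion (an illustrative sketch with arbitrary parameter values, again assuming NumPy and SciPy): draw correlated Gaussian pairs, then confirm that $ \mathbf{X}+\mathbf{Y} $ has the derived mean and variance and is consistent with a Gaussian law:

# Monte Carlo sanity check of Solution 1's conclusion (illustrative only;
# all parameter values below are arbitrary).
import numpy as np
from scipy.stats import kstest

eta_X, eta_Y = 1.0, -2.0
sigma_X, sigma_Y, r = 1.5, 0.8, 0.6

cov = [[sigma_X**2, r * sigma_X * sigma_Y],
       [r * sigma_X * sigma_Y, sigma_Y**2]]
rng = np.random.default_rng(0)
xy = rng.multivariate_normal([eta_X, eta_Y], cov, size=200_000)
z = xy.sum(axis=1)

mean_z = eta_X + eta_Y
var_z = sigma_X**2 + 2 * r * sigma_X * sigma_Y + sigma_Y**2
print(f"empirical mean {z.mean():.3f} vs derived {mean_z:.3f}")
print(f"empirical var  {z.var():.3f} vs derived {var_z:.3f}")

# Kolmogorov-Smirnov test of the standardized sum against N(0, 1); a large
# p-value is consistent with X+Y being Gaussian.
stat, p = kstest((z - mean_z) / np.sqrt(var_z), "norm")
print(f"KS statistic {stat:.4f}, p-value {p:.3f}")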


Solution 2:

Let $ Z=X+Y $, where $ X\sim N(\mu_1,\sigma_1^2) $ and $ Y\sim N(\mu_2,\sigma_2^2) $.

The characteristic function is

$ E(e^Z)=E(e^{X+Y})=e^{i(\mu_1+\mu_2)t-\frac{1}{2}\left(\sigma_1^2+\sigma_2^2+2\text{cov}(X,Y)\right)t^2} $

according to the properties of jointly distributed Gaussian random variables.

$ \therefore Z\sim N\left(\mu_1+\mu_2, \sigma_1^2+\sigma_2^2+2\text{cov}(X,Y)\right) $ according to the uniqueness of the characteristic function.
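As an empirical check of the formula above (an illustrative sketch with arbitrary parameter values), the sample average of $ e^{itZ} $ can be compared against the closed form for a few values of $ t $:

# Empirical check of the characteristic function of Z = X + Y
# (illustrative sketch; parameters are arbitrary but the covariance
# matrix must be positive definite).
import numpy as np

mu1, mu2 = 0.5, -1.0
var1, var2, cov_xy = 2.0, 1.0, 0.7

rng = np.random.default_rng(1)
xy = rng.multivariate_normal([mu1, mu2],
                             [[var1, cov_xy], [cov_xy, var2]],
                             size=500_000)
z = xy.sum(axis=1)

for t in (0.3, 0.7, 1.2):
    empirical = np.exp(1j * t * z).mean()
    closed_form = np.exp(1j * (mu1 + mu2) * t
                         - 0.5 * (var1 + var2 + 2 * cov_xy) * t**2)
    print(t, abs(empirical - closed_form))   # should be close to 0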

Comments:

  • The definition of the characteristic function is inaccurate. The characteristic function of random variable $ Z $ should be $ E[e^{itZ}] $, instead of $ E[e^Z] $. However, the final formula of the characteristic function of $ Z $, a function of $ t $, is correct.
  • It would be nicer to mention that the characteristic function of a Gaussian random variable $ X $ is $ e^{i\mu_Xt-\frac{1}{2}\sigma_X^2t^2} $, where $ \mu_X $ and $ \sigma_X^2 $ are the mean and variance of random variable $ X $. That would make the uniqueness statement clearer; a numerical illustration follows these comments.
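A numerical illustration of the second comment (a sketch with arbitrary parameters, not a proof): the empirical characteristic function of a Gaussian sample matches $ e^{i\mu t-\frac{1}{2}\sigma^2t^2} $ up to Monte Carlo error.

# Sketch: empirical characteristic function of a Gaussian sample versus
# exp(i*mu*t - 0.5*sigma^2*t^2); mu, sigma, and the t values are arbitrary.
import numpy as np

mu, sigma = 1.0, 2.0
rng = np.random.default_rng(2)
x = rng.normal(mu, sigma, size=500_000)

for t in (0.25, 0.5, 1.0):
    empirical = np.exp(1j * t * x).mean()
    closed_form = np.exp(1j * mu * t - 0.5 * sigma**2 * t**2)
    print(t, abs(empirical - closed_form))   # small Monte Carlo error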


Related Problems:

$ \mathbf{X} $ and $ \mathbf{Y} $ are two exponential random variables with different means, $ \lambda_X $ and $ \lambda_Y $, respectively. Show that $ \text{Min}(X,Y) $ is also an exponential random variable.
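A simulation sketch for this related problem (assuming, per the statement, that $ \lambda_X $ and $ \lambda_Y $ are the means of $ X $ and $ Y $, so the rates are their reciprocals): $ \text{Min}(X,Y) $ should be exponential with rate $ \frac{1}{\lambda_X}+\frac{1}{\lambda_Y} $, i.e. mean $ \frac{\lambda_X\lambda_Y}{\lambda_X+\lambda_Y} $.

# Simulation sketch: min of two independent exponentials is exponential.
# lambda_X and lambda_Y are taken as the *means* (arbitrary values).
import numpy as np
from scipy.stats import kstest

lam_X, lam_Y = 2.0, 5.0
rng = np.random.default_rng(3)
x = rng.exponential(lam_X, size=200_000)
y = rng.exponential(lam_Y, size=200_000)
m = np.minimum(x, y)

mean_min = 1.0 / (1.0 / lam_X + 1.0 / lam_Y)
print(f"empirical mean {m.mean():.4f} vs predicted {mean_min:.4f}")

# KS test of min(X, Y) against an exponential with the predicted mean
# (scipy's expon uses scale = mean, loc = 0).
stat, p = kstest(m, "expon", args=(0, mean_min))
print(f"KS statistic {stat:.4f}, p-value {p:.3f}")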


"Communication, Networks, Signal, and Image Processing" (CS)- Question 1, August 2011



Back to ECE Qualifying Exams (QE) page
