Given
$ X = X_1 - X_2 \ $
where $ X_1 $ and $ X_2 $ are iid scalar random variables.

Also, it is given that $ Y $ is a chi-squared random variable with 1 degree of freedom, so
$ f_Y(y) = \frac{e^{-\frac{y}{2}}}{\sqrt{2\pi y}} $

Additionally, it is given that $ Y= X^2 $

We first make the following assumptions:

  • the pdf of $ X_1 $ and hence that of $ X_2 $ are even functions
  • the Fourier transforms of their pdfs exist

Then we want to show that

$ X_1, X_2 \sim N(\mu,\frac{1}{2}) \Leftrightarrow f_Y(y) = \frac{e^{-\frac{y}{2}}}{\sqrt{2\pi y}} $
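As a quick numerical sanity check of this statement (not part of the proof), we can simulate the setup and compare $ Y $ with a chi-squared distribution with one degree of freedom. This is a minimal sketch assuming NumPy and SciPy are available; the value of $ \mu $, the sample size, and the seed are arbitrary:

import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
mu, n = 3.0, 200_000                     # mu is arbitrary; the variance is fixed at 1/2
x1 = rng.normal(mu, np.sqrt(0.5), n)     # X1 ~ N(mu, 1/2)
x2 = rng.normal(mu, np.sqrt(0.5), n)     # X2 ~ N(mu, 1/2), independent of X1
y = (x1 - x2) ** 2                       # Y = X^2 with X = X1 - X2

# Kolmogorov-Smirnov distance between Y and chi-squared with 1 degree of freedom
print(stats.kstest(y, stats.chi2(df=1).cdf).statistic)   # close to 0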




To prove sufficiency,

If $ X_1 $ and $ X_2 $ are independent Gaussian distributed scalar random variables with mean $ \mu $ and variance $ \sigma^2 $, and $ X = X_1 - X_2 $ is a linear combination of the two, then $ X $ is also a Gaussian distributed random variable (proof), characterized by its mean and variance, computed below.

$ \begin{align} E[X] &= E[X_1 - X_2] \\ &= E[X_1] - E[X_2] \\ &= \mu - \mu \\ &= 0 \end{align} $

proof

$ \begin{align} Var[X] &= Var[X_1 - X_2] \\ &= Var[X_1] + Var[X_2] \\ &= \sigma^2 + \sigma^2 \\ &= 2\sigma^2 \end{align} $

proof
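The two steps above (linearity of expectation, and additivity of variance for independent variables) can be spot-checked numerically. A minimal sketch assuming NumPy; the values of $ \mu $ and $ \sigma^2 $ are arbitrary:

import numpy as np

rng = np.random.default_rng(1)
mu, var, n = 2.0, 0.8, 1_000_000
x1 = rng.normal(mu, np.sqrt(var), n)
x2 = rng.normal(mu, np.sqrt(var), n)
x = x1 - x2

print(x.mean())              # close to 0
print(x.var(), 2 * var)      # close to 2*sigma^2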

Therefore we have that
$ X \sim N(0,2\sigma^2) \ $

Since $ Y=X^2 $,
$ \begin{align} F_Y(y) &= Pr[Y \leq y], y \in [0,\infty) \\ &= Pr[-\sqrt{y} \leq X \leq \sqrt{y}] \\ &= F_X(\sqrt{y}) - F_X(-\sqrt{y}) \\ &= 2F_X(\sqrt{y})- 1 \end{align} $
where the last step uses $ F_X(-\sqrt{y}) = 1 - F_X(\sqrt{y}) $, since $ X $ is symmetric about zero.
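This CDF identity can be checked against simulated values of $ X^2 $, without assuming the chi-squared result. A sketch assuming NumPy and SciPy; $ \sigma^2 $ and the test point are arbitrary:

import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
sigma2 = 0.8                                         # arbitrary
x = rng.normal(0, np.sqrt(2 * sigma2), 500_000)      # X ~ N(0, 2*sigma^2)
y_samples = x ** 2                                   # samples of Y = X^2

y = 3.0                                              # arbitrary test point
empirical = (y_samples <= y).mean()                  # empirical F_Y(y)
formula = 2 * stats.norm(0, np.sqrt(2 * sigma2)).cdf(np.sqrt(y)) - 1
print(empirical, formula)                            # agree to about three decimal places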

$ \begin{align} \Rightarrow f_Y(y) &= \frac{d}{dy}F_Y(y) \\ &= \frac{d}{dy} (2F_X(\sqrt{y})- 1) \\ &= \frac{2}{2\sqrt{y}} f_X(\sqrt{y}) \\ &= \frac{e^{-\frac{y}{2(2\sigma^2)}}}{\sqrt{2\pi(2\sigma^2)y}} \end{align} $
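The density just obtained is that of $ 2\sigma^2 $ times a chi-squared variable with one degree of freedom, so it can be compared with SciPy's scaled chi-squared pdf. A minimal sketch with an arbitrary $ \sigma^2 $:

import numpy as np
from scipy import stats

sigma2 = 0.8                                         # arbitrary
y = np.linspace(0.05, 10, 200)
derived = np.exp(-y / (2 * (2 * sigma2))) / np.sqrt(2 * np.pi * (2 * sigma2) * y)
reference = stats.chi2(df=1, scale=2 * sigma2).pdf(y)
print(np.allclose(derived, reference))               # True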

Additionally, if $ \sigma^2 = 1/2 $, then $ 2\sigma^2 = 1 $, and we have that
$ f_Y(y) = \frac{e^{-\frac{y}{2}}}{\sqrt{2\pi y}} $
i.e. $ f_Y(y) $ is the pdf of a variable with a chi-squared distribution with one degree of freedom.




To prove necessity, suppose that
$ f_Y(y) = \frac{e^{-\frac{y}{2(2\sigma^2)}}}{\sqrt{2\pi(2\sigma^2)y}} $

Recall that $ Y = X^2 $. Let $ W = |X| $, where
$ |X| = \sqrt{Y} $

Then we have that $ \begin{align} F_W(w) &= Pr[W \leq w], w\in[0,\infty) \\ &= Pr[\sqrt{Y} \leq w] \\ &= Pr[Y \leq w^2] \\ &= F_Y(w^2) \end{align} $

$ \begin{align} \Rightarrow f_W(w) &= \frac{d}{dw}F_Y(w^2) \\ &= 2wf_Y(w^2) \\ &=\frac{2e^{-\frac{w^2}{2(2\sigma^2)}}}{\sqrt{2\pi(2\sigma^2)}} \end{align} $

If $ 2\sigma^2 = 1 $, then we have that
$ f_W(w) = \begin{cases} \frac{2e^{-\frac{w^2}{2}}}{\sqrt{2\pi}} & w\geq 0 \\ 0 & else \end{cases} $
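For $ 2\sigma^2 = 1 $ this is the standard half-normal density, which SciPy provides directly; a minimal check, assuming SciPy is available:

import numpy as np
from scipy import stats

w = np.linspace(0, 5, 200)
derived = 2 * np.exp(-w ** 2 / 2) / np.sqrt(2 * np.pi)
print(np.allclose(derived, stats.halfnorm.pdf(w)))   # True: standard half-normal pdf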

We know that
$ \begin{align} X = X_1 - X_2 \\ \Rightarrow f_X(x) &= f_{X_1}(x)*f_{X_2}(-x) \\ &= f_{X_1}(x)*f_{X_2}(x) \\ &= f_{X_1}(x)*f_{X_1}(x) \end{align} $
proof

Since $ f_X(x) $ is the convolution of two even functions, $ f_X(x) $ is also even. (proof)
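Both claims (the convolution identity and the evenness of the result) can be illustrated numerically in the Gaussian case: convolving the $ N(0,\sigma^2) $ pdf with itself on a grid gives an even function equal to the $ N(0,2\sigma^2) $ pdf. A sketch assuming NumPy, with an arbitrary $ \sigma^2 $:

import numpy as np

sigma2 = 0.8                                      # arbitrary
x = np.linspace(-12, 12, 24001)                   # symmetric grid containing 0
dx = x[1] - x[0]
f_x1 = np.exp(-x ** 2 / (2 * sigma2)) / np.sqrt(2 * np.pi * sigma2)

f_x = np.convolve(f_x1, f_x1, mode="same") * dx   # grid approximation of f_X1 * f_X1
target = np.exp(-x ** 2 / (2 * (2 * sigma2))) / np.sqrt(2 * np.pi * (2 * sigma2))

print(np.allclose(f_x, target, atol=1e-6))        # True: equals the N(0, 2*sigma^2) pdf
print(np.allclose(f_x, f_x[::-1]))                # True: the convolution is even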
If $ f_X(x) $ is even and $ |X|=W $, then $ f_W(w) = 2f_X(w) $ for $ w \geq 0 $, so
$ \begin{align} f_X(x) &= \frac{1}{2}f_W(|x|) \\ &= \frac{e^{-\frac{x^2}{2(2\sigma^2)}}}{\sqrt{2\pi(2\sigma^2)}}, x\in (-\infty, \infty) \\ &= \frac{e^{-\frac{x^2}{2}}}{\sqrt{2\pi}} \end{align} $
where the last equality again uses $ 2\sigma^2 = 1 $.

Recall that
$ f_{X_1}(x)*f_{X_1}(x) = f_X(x) $
$ \Rightarrow \mathcal{F}\{f_{X_1}(x)*f_{X_1}(x)\} = \mathcal{F}\{f_X(x)\} $
$ \Rightarrow \mathcal{F}\{f_{X_1}(x)\}\mathcal{F}\{f_{X_1}(x)\} = \mathcal{F}\{f_X(x)\} $
$ \Rightarrow (\mathcal{F}\{f_{X_1}(x)\})^2 = \mathcal{F}\{f_X(x)\} $

The convolution property above holds for the transform convention $ \mathcal{F}\{f(x)\} = \int_{-\infty}^{\infty}f(x)e^{-j\omega x}dx $, and with this convention the transform of a zero-mean Gaussian pdf with variance $ v $ is $ e^{-\frac{v\omega^2}{2}} $. Therefore,

$ \begin{align} \Rightarrow \mathcal{F}\{f_{X_1}(x)\} &= \sqrt{\mathcal{F}\{f_X(x)\}} \\ &= \sqrt{\mathcal{F}\{\frac{e^{-\frac{x^2}{2(2\sigma^2)}}}{\sqrt{2\pi(2\sigma^2)}}\}} \\ &= \sqrt{e^{-\frac{(2\sigma^2)\omega^2}{2}}} \\ &= e^{-\frac{\sigma^2\omega^2}{2}} \end{align} $

where the positive square root is taken because $ f_{X_1} $ is even (so its transform is real) and the transform equals 1 at $ \omega = 0 $. This is the transform of a zero-mean Gaussian pdf with variance $ \sigma^2 $, so taking the inverse transform,

$ f_{X_1}(x) = \frac{e^{-\frac{x^2}{2\sigma^2}}}{\sqrt{2\pi\sigma^2}} $

With $ 2\sigma^2 = 1 $, i.e. $ \sigma^2 = \frac{1}{2} $, this gives $ X_1, X_2 \sim N(0,\frac{1}{2}) $, which completes the necessity argument.
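The square-root step can also be checked numerically: evaluating $ \mathcal{F}\{f_X\} $ by direct integration and taking its positive real square root reproduces $ e^{-\sigma^2\omega^2/2} $, the transform of the $ N(0,\sigma^2) $ pdf. A sketch assuming NumPy, with an arbitrary $ \sigma^2 $:

import numpy as np

sigma2 = 0.8                                          # arbitrary
x = np.linspace(-20, 20, 4001)
dx = x[1] - x[0]
f_x = np.exp(-x ** 2 / (2 * (2 * sigma2))) / np.sqrt(2 * np.pi * (2 * sigma2))

omega = np.linspace(-5, 5, 101)
# F{f_X}(omega) = integral of f_X(x) e^{-j*omega*x} dx, approximated by a Riemann sum
F_X = np.array([(f_x * np.exp(-1j * w * x)).sum() * dx for w in omega])
F_X1 = np.sqrt(F_X.real)                              # positive real square root

print(np.allclose(F_X1, np.exp(-sigma2 * omega ** 2 / 2), atol=1e-6))   # True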



Scratch

$ E[e^{-jtY}] = E[e^{-jtX^2}] = \int_{-\infty}^{\infty}e^{-jtx^2}f_X(x)dx $

We want to show that

$ X_1, X_2 \sim N(\mu,\frac{1}{2}) \Leftrightarrow f_Y(y) = \frac{e^{-\frac{y}{2}}}{\sqrt{2\pi y}} $



Let $ u = x^2 $, then
$ dx = \frac{du}{2x} $

and we have that (splitting the integral at zero and using the evenness of $ f_X $)
$ \begin{align} E[e^{-jtX^2}] &= \int_{-\infty}^{\infty}e^{-jtx^2}f_X(x)dx \\ &= 2\int_{0}^{\infty}e^{-jtx^2}f_X(x)dx \\ &= 2\int_{0}^{\infty}e^{-jtu}\frac{f_X(\sqrt{u})}{2\sqrt{u}}du \\ &= \int_{0}^{\infty}e^{-jtu}\frac{f_X(\sqrt{u})}{\sqrt{u}}du \\ &= \mathcal{F}\{\frac{f_X(\sqrt{x})}{\sqrt{x}}\} \end{align} $
where the transformed function is taken to be zero for negative arguments.

Since $ Y = X^2 $ and $ E[e^{-jtY}] = \mathcal{F}\{f_Y(y)\} $, we have that
$ \begin{align} \mathcal{F}\{\frac{f_X(\sqrt{x})}{\sqrt{x}}\} &= \mathcal{F}\{\frac{e^{-\frac{x}{2}}}{\sqrt{2\pi x}}\} \\ \Leftrightarrow \frac{f_X(\sqrt{x})}{\sqrt{x}} &= \frac{e^{-\frac{x}{2}}}{\sqrt{2\pi x}} \\ \Leftrightarrow f_X(\sqrt{x}) &= \frac{e^{-\frac{x}{2}}}{\sqrt{2\pi}} \\ \Leftrightarrow f_X(x) &= \frac{e^{-\frac{x^2}{2}}}{\sqrt{2\pi}} \end{align} $
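The recovered $ f_X $ is consistent with the relation $ f_Y(y) = \frac{f_X(\sqrt{y})}{\sqrt{y}} $ implicit in the step above; a small numerical check assuming NumPy:

import numpy as np

y = np.linspace(0.01, 12, 300)
f_X = lambda x: np.exp(-x ** 2 / 2) / np.sqrt(2 * np.pi)   # recovered standard normal pdf
f_Y = np.exp(-y / 2) / np.sqrt(2 * np.pi * y)              # given chi-squared(1) pdf
print(np.allclose(f_X(np.sqrt(y)) / np.sqrt(y), f_Y))      # True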

Recall that $ X = X_1 - X_2 $, where $ X_1 $ and $ X_2 $ are iid. Therefore,
$ \begin{align} f_1(x) * f_2(-x) &= \frac{e^{\frac{-x^2}{2}}}{\sqrt{2\pi}} \\ \Leftrightarrow f_1(x) * f_1(-x) &= \frac{e^{\frac{-x^2}{2}}}{\sqrt{2\pi}} \\ \Leftrightarrow F_1(t)F_1(-t) &= e^{\frac{-t^2}{2}} \\ \Leftrightarrow |F_1(t)|^2 &= e^{\frac{-t^2}{2}} \end{align} $
where $ F_1(t) = \mathcal{F}\{f_1(x)\} $; the last step uses $ F_1(-t) = \overline{F_1(t)} $, since $ f_1 $ is real.

Writing $ F_1(t) = |F_1(t)|e^{jg(t)} $, where $ g(t) $ is the phase:

$ F_1(t) $ is real if we can assume that the pdfs of $ X_1 $, and hence of $ X_2 $, are even functions. Then $ g(t) = 0 $ and $ F_1(t) $ is given by
$ F_1(t) = e^{-\frac{t^2}{4}} $

Taking the inverse Fourier transform, we get
$ f_1(x) = \frac{e^{-x^2}}{\sqrt{\pi}} $
which is the pdf of $ N(0,\frac{1}{2}) $, in agreement with the result above.
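The inverse-transform step can be verified numerically: computing $ \frac{1}{2\pi}\int_{-\infty}^{\infty} e^{-t^2/4}e^{jtx}dt $ on a grid reproduces $ \frac{e^{-x^2}}{\sqrt{\pi}} $. A sketch assuming NumPy:

import numpy as np

t = np.linspace(-60, 60, 60001)
dt = t[1] - t[0]
x = np.linspace(-4, 4, 41)
# inverse transform (1/(2*pi)) * integral of F_1(t) e^{j*t*x} dt, approximated by a Riemann sum
vals = [(np.exp(-t ** 2 / 4) * np.exp(1j * t * xi)).sum() * dt for xi in x]
f1 = np.real(np.array(vals)) / (2 * np.pi)
print(np.allclose(f1, np.exp(-x ** 2) / np.sqrt(np.pi), atol=1e-6))   # True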
