Given
$ X = X_1 - X_2 \ $
where $ X_1 $ and $ X_2 $ are iid scalar random variables.
Also, given that $ Y $ is a chi-squared random variable with 1 degree of freedom. So,
$ f_Y(y) = \frac{e^{-\frac{y}{2}}}{\sqrt{2\pi y}} $
Additionally, it is given that $ Y= X^2 $
We first make the following assumptions:
- the pdf of $ X_1 $ and hence that of $ X_2 $ are even functions
- the Fourier transforms of their pdfs exist
Then we want to show that
$ X_1, X_2 \sim N(\mu,\frac{1}{2}) \Leftrightarrow f_Y(y) = \frac{e^{-\frac{y}{2}}}{\sqrt{2\pi y}} $
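Before proving this, the claim can be sanity checked by simulation. The following is a minimal sketch (assuming numpy and scipy are available, and choosing an arbitrary mean $ \mu = 3 $): draw $ X_1, X_2 \sim N(\mu,\frac{1}{2}) $, form $ Y = (X_1 - X_2)^2 $, and compare the empirical distribution of $ Y $ with the chi-squared(1) cdf.

import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n, mu, var = 10**6, 3.0, 0.5                  # the mean cancels in X1 - X2; the variance must be 1/2
x1 = rng.normal(mu, np.sqrt(var), n)
x2 = rng.normal(mu, np.sqrt(var), n)
y = (x1 - x2)**2

# compare the empirical cdf of Y with the chi-squared(1) cdf at a few points
for q in [0.1, 0.5, 1.0, 2.0, 4.0]:
    print(q, (y <= q).mean(), stats.chi2(df=1).cdf(q))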
To prove sufficiency, suppose $ X_1 $ and $ X_2 $ are two Gaussian distributed scalar random variables with mean $ \mu $ and variance $ \sigma^2 $. Since $ X = X_1 - X_2 $ is a linear combination of independent Gaussian random variables, $ X $ is also Gaussian distributed, characterized by its mean and variance.
$ \begin{align} E[X] &= E[X_1 - X_2] \\ &= E[X_1] - E[X_2] \\ &= \mu - \mu \\ &= 0 \end{align} $
and, since $ X_1 $ and $ X_2 $ are independent,
$ \begin{align} Var[X] &= Var[X_1 - X_2] \\ &= Var[X_1] + Var[X_2] \\ &= \sigma^2 + \sigma^2 \\ &= 2\sigma^2 \end{align} $
Therefore we have that
$ X \sim N(0,2\sigma^2) $
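These two moments can be checked quickly by simulation (a sketch assuming numpy, with an arbitrary $ \mu $ and $ \sigma^2 $):

import numpy as np

rng = np.random.default_rng(1)
mu, sigma2, n = 1.5, 0.8, 10**6
x = rng.normal(mu, np.sqrt(sigma2), n) - rng.normal(mu, np.sqrt(sigma2), n)
print(x.mean(), x.var())   # approximately 0 and 2*sigma2 = 1.6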
Since $ Y=X^2 $ and the distribution of $ X $ is symmetric about zero,
$ \begin{align} F_Y(y) &= Pr[Y \leq y], y \in [0,\infty) \\ &= Pr[-\sqrt{y} \leq X \leq \sqrt{y}] \\ &= F_X(\sqrt{y}) - F_X(-\sqrt{y}) \\ &= 2F_X(\sqrt{y})- 1 \end{align} $
$ \begin{align} \Rightarrow f_Y(y) &= \frac{d}{dy}F_Y(y) \\ &= \frac{d}{dy} (2F_X(\sqrt{y})- 1) \\ &= \frac{2}{2\sqrt{y}} f_X(\sqrt{y}) \\ &= \frac{e^{-\frac{y}{2(2\sigma^2)}}}{\sqrt{2\pi(2\sigma^2)y}} \end{align} $
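As a check of this change-of-variables result for a generic $ \sigma^2 $, one can compare the formula against a crude density estimate from samples of $ X^2 $ with $ X \sim N(0,2\sigma^2) $ (a sketch assuming numpy, with $ \sigma^2 = 0.8 $ chosen arbitrarily):

import numpy as np

rng = np.random.default_rng(2)
sigma2 = 0.8
x = rng.normal(0.0, np.sqrt(2 * sigma2), 10**6)
y = x**2

for q in [0.5, 1.0, 2.0, 4.0]:
    h = 0.05                                         # narrow window around q for a crude density estimate
    emp = ((y > q - h) & (y < q + h)).mean() / (2 * h)
    f_q = np.exp(-q / (2 * (2 * sigma2))) / np.sqrt(2 * np.pi * (2 * sigma2) * q)
    print(q, round(emp, 3), round(f_q, 3))           # the two columns agree closely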
Additionally, if $ \sigma^2 = 1/2 $, then $ 2\sigma^2 = 1 $, and we have that
$ f_Y(y) = \frac{e^{-\frac{y}{2}}}{\sqrt{2\pi y}} $
i.e. $ f_Y(y) $ is the pdf of a chi-squared random variable with one degree of freedom. This proves sufficiency.
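For reference, the expression above can be compared directly against a library implementation of the chi-squared(1) pdf (a sketch assuming scipy):

import numpy as np
from scipy import stats

y = np.linspace(0.1, 10, 50)
ours = np.exp(-y / 2) / np.sqrt(2 * np.pi * y)
print(np.allclose(ours, stats.chi2(df=1).pdf(y)))    # True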
To prove necessity, suppose that
$ f_Y(y) = \frac{e^{-\frac{y}{2}}}{\sqrt{2\pi y}} $
and consider the Fourier transform (characteristic function) of $ Y $:
$ E[e^{-jtY}] = E[e^{-jtX^2}] = \int_{-\infty}^{\infty}e^{-jtx^2}f_X(x)dx $
Let $ u = x^2 $, so that for $ x > 0 $,
$ dx = \frac{du}{2\sqrt{u}} $
Using the assumption that $ f_X $ is even (it is the pdf of the difference of two iid variables with even pdfs), we have that
$ \begin{align} E[e^{-jtX^2}] &= \int_{-\infty}^{\infty}e^{-jtx^2}f_X(x)dx \\ &= 2\int_{0}^{\infty}e^{-jtx^2}f_X(x)dx \\ &= 2\int_{0}^{\infty}e^{-jtu}\frac{f_X(\sqrt{u})}{2\sqrt{u}}du \\ &= \int_{0}^{\infty}e^{-jtu}\frac{f_X(\sqrt{u})}{\sqrt{u}}du \\ &= \mathcal{F}\{\frac{f_X(\sqrt{x})}{\sqrt{x}}\} \end{align} $
where $ \mathcal{F}\{\cdot\} $ denotes the Fourier transform (in $ t $) of a function supported on $ [0,\infty) $.
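The substitution can be verified numerically: both integrals below compute $ E[e^{-jtY}] $ and agree (a sketch assuming scipy, with $ X \sim N(0,2\sigma^2) $, $ \sigma^2 = 0.8 $, and $ t = 0.7 $ chosen arbitrarily):

import numpy as np
from scipy import stats
from scipy.integrate import quad

sigma2, t = 0.8, 0.7
f_X = stats.norm(0, np.sqrt(2 * sigma2)).pdf

def cquad(f, a, b):
    # quad integrates real-valued functions, so handle real and imaginary parts separately
    return quad(lambda v: f(v).real, a, b)[0] + 1j * quad(lambda v: f(v).imag, a, b)[0]

lhs = cquad(lambda x: np.exp(-1j * t * x**2) * f_X(x), -20, 20)   # effectively all of R
rhs = cquad(lambda u: np.exp(-1j * t * u) * f_X(np.sqrt(u)) / np.sqrt(u), 0, 60)
print(lhs, rhs)   # both equal E[exp(-j*t*Y)]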
On the other hand, $ E[e^{-jtY}] = \mathcal{F}\{f_Y(x)\} $, so we have that
$ \begin{align} \mathcal{F}\{\frac{f_X(\sqrt{x})}{\sqrt{x}}\} &= \mathcal{F}\{\frac{e^{-\frac{x}{2}}}{\sqrt{2\pi x}}\} \\ \Leftrightarrow \frac{f_X(\sqrt{x})}{\sqrt{x}} &= \frac{e^{-\frac{x}{2}}}{\sqrt{2\pi x}} \\ \Leftrightarrow f_X(\sqrt{x}) &= \frac{e^{-\frac{x}{2}}}{\sqrt{2\pi}} \\ \Leftrightarrow f_X(x) &= \frac{e^{-\frac{x^2}{2}}}{\sqrt{2\pi}} \end{align} $
i.e. $ X \sim N(0,1) $. (The last step holds for $ x \geq 0 $ and extends to all $ x $ since $ f_X $ is even.)
Recall that $ X = X_1 - X_2 $, where $ X_1 $ and $ X_2 $ are iid, so the pdf of $ X $ is the convolution of the pdf of $ X_1 $ with the pdf of $ -X_2 $. Therefore,
$ \begin{align} f_1(x) * f_2(-x) &= \frac{e^{-\frac{x^2}{2}}}{\sqrt{2\pi}} \\ \Leftrightarrow f_1(x) * f_1(-x) &= \frac{e^{-\frac{x^2}{2}}}{\sqrt{2\pi}} \\ \Leftrightarrow F_1(t) \cdot F_1(-t) &= e^{-\frac{t^2}{2}} \\ \Leftrightarrow |F_1(t)|^2 &= e^{-\frac{t^2}{2}} \end{align} $
where the third line takes the Fourier transform of both sides (the transform of the $ N(0,1) $ pdf is $ e^{-\frac{t^2}{2}} $), and the last line uses the fact that $ f_1 $ is real, so $ F_1(-t) = F_1(t)^* $.
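The Fourier transform pair used in the third line can be checked numerically (a sketch assuming scipy; the imaginary part of the transform vanishes because the $ N(0,1) $ pdf is even):

import numpy as np
from scipy.integrate import quad

def ft_gauss(t):
    # F(t) = integral of f(x) * exp(-j*t*x) dx, with f the N(0,1) pdf; only the cosine part survives
    f = lambda x: np.exp(-x**2 / 2) / np.sqrt(2 * np.pi) * np.cos(t * x)
    return quad(f, -np.inf, np.inf)[0]

for t in [0.0, 0.5, 1.0, 2.0]:
    print(t, ft_gauss(t), np.exp(-t**2 / 2))   # the two columns agree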
In general, $ F_1(t) = |F_1(t)|e^{jg(t)} $, where $ g(t) $ is the phase.
$ F_1(t) $ is real and even if we can assume that the pdfs of $ X_1 $ and hence $ X_2 $ are even functions. Moreover, $ F_1(0) = 1 $ and $ |F_1(t)| $ never vanishes, so by continuity $ g(t) = 0 $ and $ F_1(t) $ is given by
$ F_1(t) = e^{-\frac{t^2}{4}} $
Taking the inverse Fourier transform, we get
$ f_1(x) = \frac{e^{-x^2}}{\sqrt{\pi}} $
which is the pdf of a $ N(0,\frac{1}{2}) $ random variable, i.e. $ X_1, X_2 \sim N(\mu,\frac{1}{2}) $ with $ \mu = 0 $ under the evenness assumption. This proves necessity.
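This last inverse transform can also be verified numerically (a sketch assuming scipy): computing $ \frac{1}{2\pi}\int_{-\infty}^{\infty} e^{-\frac{t^2}{4}}e^{jtx}dt $ by quadrature reproduces $ \frac{e^{-x^2}}{\sqrt{\pi}} $, the $ N(0,\frac{1}{2}) $ pdf.

import numpy as np
from scipy.integrate import quad

def f1(x):
    # inverse Fourier transform of exp(-t^2/4); the sine part vanishes by symmetry
    return quad(lambda t: np.exp(-t**2 / 4) * np.cos(t * x), -np.inf, np.inf)[0] / (2 * np.pi)

for x in [0.0, 0.5, 1.0, 2.0]:
    print(x, f1(x), np.exp(-x**2) / np.sqrt(np.pi))   # the two columns agree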