Theorem

Any linear combination of independent Gaussian random variables is also Gaussian.

$ \Leftrightarrow $

If $ X_1, X_2,...,X_n $ are $ n $ independent Gaussian random variables, then the random variable $ Y $ is also Gaussian, where $ Y $ is a linear combination of the $ X_i $'s, $ i = 1, 2,...,n $, given by
$ \begin{align} Y &= \sum_{i=1}^n a_i X_i \\ a_i &\in \mathbb{R} \end{align} $

$ \Leftrightarrow $

if
$ \begin{align} X_i &\sim N(\mu_i, \sigma_i^2) \\ i &= 1,2,...,n \end{align} $

and if
$ \begin{align} Y &= \sum_{i=1}^n a_i X_i \\ a_i &\in \mathbb{R} \end{align} $

then
$ Y\sim N\left(\sum_{i=1}^n a_i\mu_i,\ \sum_{i=1}^n a_i^2\sigma_i^2\right) $
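
The following short Monte Carlo sketch (not part of the original proof) checks the claim numerically with NumPy and SciPy; the particular means, variances, and coefficients below are arbitrary choices made purely for illustration.

 # Hypothetical example: check that Y = sum_i a_i X_i behaves like a Gaussian
 # with mean sum_i a_i mu_i and variance sum_i a_i^2 sigma_i^2.
 import numpy as np
 from scipy import stats
 
 rng = np.random.default_rng(0)
 mu    = np.array([1.0, -2.0, 0.5])    # assumed mu_i
 sigma = np.array([0.5,  1.5, 2.0])    # assumed sigma_i (standard deviations)
 a     = np.array([2.0,  0.3, -1.0])   # assumed a_i
 
 X = rng.normal(loc=mu, scale=sigma, size=(1_000_000, 3))  # independent Gaussian samples
 Y = X @ a                                                  # Y = sum_i a_i X_i
 
 m_theory = np.sum(a * mu)             # sum_i a_i mu_i
 v_theory = np.sum(a**2 * sigma**2)    # sum_i a_i^2 sigma_i^2
 print(Y.mean(), m_theory)             # empirical vs. theoretical mean
 print(Y.var(), v_theory)              # empirical vs. theoretical variance
 
 # Kolmogorov-Smirnov test against N(m_theory, v_theory); a non-small p-value is
 # consistent with Y being Gaussian with these parameters.
 print(stats.kstest(Y, 'norm', args=(m_theory, np.sqrt(v_theory))))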


Proof

Let $ M_Y(t) $ be the moment generating function of $ Y $, and let $ M_{X_i}(t) $ be the moment generating function of $ X_i $. Since the $ X_i $ are Gaussian variables with means $ \mu_i $ and variances $ \sigma_i^2 $ respectively, we have that
$ M_{X_i}(t) = E[e^{tX_i}] = e^{\mu_i t + \frac{1}{2}\sigma_i^2 t^2} $
$ \Rightarrow M_{a_iX_i}(t) = E[e^{t(a_iX_i)}] = e^{a_i\mu_i t + \frac{1}{2}a_i^2\sigma_i^2 t^2} $
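
As a sanity check on the MGF formula above, here is a short symbolic computation in SymPy (an illustrative addition, not part of the original page) that evaluates $ E[e^{tX}] $ directly from the Gaussian density.

 # Sketch: derive the Gaussian MGF symbolically and compare with the formula above.
 import sympy as sp
 
 x, t = sp.symbols('x t', real=True)
 mu = sp.Symbol('mu', real=True)
 sigma = sp.Symbol('sigma', positive=True)
 
 pdf = sp.exp(-(x - mu)**2 / (2 * sigma**2)) / (sigma * sp.sqrt(2 * sp.pi))
 M = sp.integrate(sp.exp(t * x) * pdf, (x, -sp.oo, sp.oo))
 print(sp.simplify(M))   # expected: exp(mu*t + sigma**2*t**2/2)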

Since the $ X_i $ are mutually independent and $ Y $ is a linear combination of them, $ M_Y(t) $ is the product of the individual $ M_{a_iX_i}(t) $ (proof), i.e.
$ \begin{align} Y &= \sum_{i=1}^n a_i X_i \\ a_i &\in \mathbb{R} \end{align} $
$ \begin{align} \Rightarrow M_Y(t) &= \prod_{i=1}^n M_{a_iX_i}(t) \\ &= \prod_{i=1}^n e^{a_i\mu_i t + \frac{1}{2}a_i^2\sigma_i^2 t^2} \\ &= e^{\left(\sum_{i=1}^n a_i\mu_i\right)t + \frac{1}{2}\left(\sum_{i=1}^n a_i^2\sigma_i^2\right)t^2} \end{align} $
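
To see the product collapse into a single Gaussian MGF, the following SymPy sketch (an illustration added here, using assumed symbols for the $ n = 2 $ case) multiplies two factors $ M_{a_iX_i}(t) $ and compares the result with the MGF of $ N(a_1\mu_1 + a_2\mu_2,\ a_1^2\sigma_1^2 + a_2^2\sigma_2^2) $.

 # Sketch for n = 2: the product of the individual MGFs equals the MGF of a single
 # Gaussian whose mean and variance are the weighted sums from the theorem.
 import sympy as sp
 
 t = sp.Symbol('t', real=True)
 a1, a2, mu1, mu2 = sp.symbols('a1 a2 mu1 mu2', real=True)
 s1, s2 = sp.symbols('sigma1 sigma2', positive=True)
 
 M1 = sp.exp(a1*mu1*t + sp.Rational(1, 2) * a1**2 * s1**2 * t**2)   # M_{a_1 X_1}(t)
 M2 = sp.exp(a2*mu2*t + sp.Rational(1, 2) * a2**2 * s2**2 * t**2)   # M_{a_2 X_2}(t)
 M_Y = sp.exp((a1*mu1 + a2*mu2)*t + sp.Rational(1, 2)*(a1**2*s1**2 + a2**2*s2**2)*t**2)
 
 print(sp.simplify(M1 * M2 - M_Y))   # expected: 0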

Note that $ M_Y(t) $ is the moment generating function of a Gaussian variable with mean $ \mu $ and variance $ \sigma^2 $ where
$ \begin{align} \mu &= \sum_{i=1}^n a_i\mu_i \\ \sigma^2 &= \sum_{i=1}^n a_i^2\sigma_i^2 \end{align} $

Since the moment generating function uniquely determines the distribution, we thus have that
$ Y\sim N\left(\sum_{i=1}^n a_i\mu_i,\ \sum_{i=1}^n a_i^2\sigma_i^2\right)_{\blacksquare} $



Back to list of proofs
