'''Fact'''
  
Let <math class="inline">\mathbf{X}</math>  and <math class="inline">\mathbf{Y}</math>  be two jointly-distributed, statistically independent random variables, having pdfs <math class="inline">f_{\mathbf{X}}\left(x\right)</math>  and <math class="inline">f_{\mathbf{Y}}\left(y\right)</math> . Then the pdf of <math class="inline">\mathbf{Z}=\mathbf{X}+\mathbf{Y}</math>  is  
  
<math class="inline">f_{\mathbf{Z}}\left(z\right)=\left(f_{\mathbf{X}}*f_{\mathbf{Y}}\right)\left(z\right)=\int_{-\infty}^{\infty}f_{\mathbf{X}}\left(x\right)f_{\mathbf{Y}}\left(z-x\right)dx=\int_{-\infty}^{\infty}f_{\mathbf{Y}}\left(y\right)f_{\mathbf{X}}\left(z-y\right)dy</math>.  
  
 
The discrete version of the above equation is
  
<math class="inline">P\left(\mathbf{Z}=z\right)=\sum_{k=-\infty}^{\infty}P\left(\mathbf{X}=k\right)P\left(\mathbf{Y}=z-k\right)=\sum_{k=-\infty}^{\infty}P\left(\mathbf{X}=z-k\right)P\left(\mathbf{Y}=k\right)</math>.  
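The discrete case is exactly a convolution of the two PMFs, which makes it easy to check numerically. A minimal sketch assuming NumPy, with two made-up PMFs on small supports (the probability values are illustrative, not from the notes):

```python
import numpy as np

# Hypothetical PMFs of X on {0, 1, 2} and Y on {0, 1} (illustrative values).
p_x = np.array([0.2, 0.5, 0.3])   # P(X=0), P(X=1), P(X=2)
p_y = np.array([0.6, 0.4])        # P(Y=0), P(Y=1)

# P(Z=z) = sum_k P(X=k) P(Y=z-k): exactly a discrete convolution.
p_z = np.convolve(p_x, p_y)       # support of Z is {0, 1, 2, 3}

print(p_z)        # the resulting PMF of Z = X + Y
print(p_z.sum())  # should sum to 1
```

Here `np.convolve` implements the sum over <math class="inline">k</math> directly; the output length is <math class="inline">3+2-1=4</math>, matching the support of <math class="inline">\mathbf{Z}</math>.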
  
 
Example. Sum of two exponential random variables
  
Let <math class="inline">\mathbf{X}</math> and <math class="inline">\mathbf{Y}</math> be two jointly-distributed, statistically independent RVs, each having an exponential distribution with mean <math class="inline">\mu</math>. Let <math class="inline">\mathbf{Z}=\mathbf{X}+\mathbf{Y}</math>. Find <math class="inline">f_{\mathbf{Z}}\left(z\right)</math>.
  
<math class="inline">f_{\mathbf{Z}}\left(z\right)=\int_{0}^{z}\frac{1}{\mu}e^{-x/\mu}\cdot\frac{1}{\mu}e^{-\left(z-x\right)/\mu}dx=\frac{z}{\mu^{2}}e^{-z/\mu},\quad z\geq0</math>
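The convolution of two exponential densities with mean <math class="inline">\mu</math> gives the Erlang-2 density <math class="inline">\frac{z}{\mu^{2}}e^{-z/\mu}</math>, whose mean is <math class="inline">2\mu</math> and variance <math class="inline">2\mu^{2}</math>. A rough Monte Carlo sanity check assuming NumPy (the value <math class="inline">\mu=2</math> is an arbitrary illustrative choice):

```python
import numpy as np

# Monte Carlo sanity check: if X, Y ~ Exponential(mean mu) are independent,
# Z = X + Y should have mean 2*mu and variance 2*mu^2 (Erlang-2).
# mu = 2.0 is a hypothetical choice for illustration.
rng = np.random.default_rng(0)
mu = 2.0
n = 200_000
z = rng.exponential(mu, n) + rng.exponential(mu, n)

print(z.mean())  # close to 2*mu   = 4
print(z.var())   # close to 2*mu^2 = 8
```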
  
 
Joint Gaussian pdf
  
<math class="inline">f_{\mathbf{XY}}(x,y)=\frac{1}{2\pi\sigma_{\mathbf{X}}\sigma_{\mathbf{Y}}\sqrt{1-r^{2}}}\exp\left\{ \frac{-1}{2\left(1-r^{2}\right)}\left[\frac{\left(x-\mu_{\mathbf{X}}\right)^{2}}{\sigma_{\mathbf{X}}^{2}}-\frac{2r\left(x-\mu_{\mathbf{X}}\right)\left(y-\mu_{\mathbf{Y}}\right)}{\sigma_{\mathbf{X}}\sigma_{\mathbf{Y}}}+\frac{\left(y-\mu_{\mathbf{Y}}\right)^{2}}{\sigma_{\mathbf{Y}}^{2}}\right]\right\}</math>   
  
If <math class="inline">r=0</math>  (<math class="inline">\mathbf{X}</math>  and <math class="inline">\mathbf{Y}</math>  are uncorrelated),
  
<math class="inline">f_{\mathbf{XY}}(x,y)=\frac{1}{\sqrt{2\pi}\sigma_{\mathbf{X}}}e^{-\frac{\left(x-\mu_{\mathbf{X}}\right)^{2}}{2\sigma_{\mathbf{X}}^{2}}}\cdot\frac{1}{\sqrt{2\pi}\sigma_{\mathbf{Y}}}e^{-\frac{\left(y-\mu_{\mathbf{Y}}\right)^{2}}{2\sigma_{\mathbf{Y}}^{2}}}=f_{\mathbf{X}}(x)\cdot f_{\mathbf{Y}}(y)</math>
  
<math class="inline">\Longrightarrow\mathbf{X}</math>  and <math class="inline">\mathbf{Y}</math>  are independent.
  
If two RVs <math class="inline">\mathbf{X}</math> and <math class="inline">\mathbf{Y}</math> are jointly Gaussian and uncorrelated, then <math class="inline">\mathbf{X}</math> and <math class="inline">\mathbf{Y}</math> are independent. This is the exception to “<math class="inline">\mathbf{X}</math> and <math class="inline">\mathbf{Y}</math> are independent <math class="inline">\left(\nLeftarrow\right)\Longrightarrow\mathbf{X}</math> and <math class="inline">\mathbf{Y}</math> are uncorrelated”.
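The <math class="inline">r=0</math> factorization can be verified numerically by evaluating the joint pdf and the product of the marginals at the same point. A minimal sketch assuming NumPy; the parameter values and test point are hypothetical choices for illustration:

```python
import numpy as np

# Hypothetical parameters for the two marginals (illustrative values).
mu_x, mu_y, sig_x, sig_y = 1.0, -2.0, 0.5, 3.0

def joint_r0(x, y):
    # Joint Gaussian pdf from the formula above, specialized to r = 0.
    q = (x - mu_x) ** 2 / sig_x ** 2 + (y - mu_y) ** 2 / sig_y ** 2
    return np.exp(-q / 2) / (2 * np.pi * sig_x * sig_y)

def marginal(t, mu, sig):
    # One-dimensional Gaussian pdf.
    return np.exp(-((t - mu) ** 2) / (2 * sig ** 2)) / (np.sqrt(2 * np.pi) * sig)

x0, y0 = 0.3, -1.7  # arbitrary test point
diff = abs(joint_r0(x0, y0) - marginal(x0, mu_x, sig_x) * marginal(y0, mu_y, sig_y))
print(diff)  # ~0 up to floating-point rounding
```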
  
 
Definition. Uncorrelatedness
 
Two RVs <math class="inline">\mathbf{X}</math> and <math class="inline">\mathbf{Y}</math> are uncorrelated if their covariance is equal to zero. This is true if any of the following three equivalent conditions holds:
  
1. <math class="inline">Cov\left(\mathbf{X},\mathbf{Y}\right)=0</math>  
  
2. <math class="inline">r_{\mathbf{XY}}=0</math>  
  
3. <math class="inline">E\left[\mathbf{XY}\right]=E\left[\mathbf{X}\right]E\left[\mathbf{Y}\right]</math>  
  
 
Note
  
• <math class="inline">\mathbf{X}</math>  and <math class="inline">\mathbf{Y}</math>  are uncorrelated <math class="inline">\Longleftrightarrow E\left[\mathbf{XY}\right]=E\left[\mathbf{X}\right]\cdot E\left[\mathbf{Y}\right]\Longleftrightarrow Cov\left(\mathbf{X},\mathbf{Y}\right)=0</math>  
  
• <math class="inline">\mathbf{X}</math>  and <math class="inline">\mathbf{Y}</math>  are independent <math class="inline">\Longleftrightarrow  f_{\mathbf{XY}}(x,y)=f_{\mathbf{X}}(x)\cdot f_{\mathbf{Y}}(y)</math>  
  
• <math class="inline">\mathbf{X}</math>  and <math class="inline">\mathbf{Y}</math>  are independent <math class="inline">\left(\nLeftarrow\right)\Longrightarrow \mathbf{X}</math>  and <math class="inline">\mathbf{Y}</math>  are uncorrelated
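A standard counterexample for the last bullet (uncorrelated but not independent) can be worked out with exact arithmetic. The choice <math class="inline">\mathbf{Y}=\mathbf{X}^{2}</math> with <math class="inline">\mathbf{X}</math> uniform on <math class="inline">\left\{-1,0,1\right\}</math> is an illustrative choice, not from the notes:

```python
from fractions import Fraction

# Counterexample: X uniform on {-1, 0, 1}, Y = X^2.
# Y is a deterministic function of X (clearly dependent), yet Cov(X, Y) = 0.
support = [-1, 0, 1]
p = Fraction(1, 3)  # P(X = x) for each point of the support

e_x  = sum(p * x for x in support)          # E[X]  = 0
e_y  = sum(p * x * x for x in support)      # E[Y]  = E[X^2] = 2/3
e_xy = sum(p * x ** 3 for x in support)     # E[XY] = E[X^3] = 0

cov = e_xy - e_x * e_y
print(cov)  # 0, so X and Y are uncorrelated
# Yet P(Y=1 | X=1) = 1 while P(Y=1) = 2/3, so X and Y are not independent.
```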
 
Definition. Orthogonality
  
Two RVs are orthogonal if <math class="inline">E\left[\mathbf{XY}\right]=0</math>.
  
 
Fact
  
If <math class="inline">E\left[\mathbf{X}^{2}\right]<\infty</math> and <math class="inline">E\left[\mathbf{Y}^{2}\right]<\infty</math>, then <math class="inline">\left|E\left[\mathbf{XY}\right]\right|\leq\sqrt{E\left[\mathbf{X}^{2}\right]E\left[\mathbf{Y}^{2}\right]}</math> with equality iff <math class="inline">\mathbf{Y}=a_{0}\mathbf{X}</math> where <math class="inline">a_{0}</math> is a constant.
  
 
Recall
  
For a quadratic equation <math class="inline">ax^{2}+bx+c=0,\; a\neq0</math> , the discriminant is <math class="inline">b^{2}-4ac</math> .
  
 
Proof
  
<math class="inline">E\left[\left(a\mathbf{X}-\mathbf{Y}\right)^{2}\right]\geq0\Longrightarrow a^{2}E\left[\mathbf{X}^{2}\right]-2aE\left[\mathbf{XY}\right]+E\left[\mathbf{Y}^{2}\right]\geq0.</math>  
  
 
n.b. The LHS is a quadratic in <math class="inline">a</math>.
  
Let's consider two cases: (i) <math class="inline">E\left[\left(a\mathbf{X}-\mathbf{Y}\right)^{2}\right]>0</math>, (ii) <math class="inline">E\left[\left(a\mathbf{X}-\mathbf{Y}\right)^{2}\right]=0</math>.
  
(i)  <math class="inline">0<E\left[\left(a\mathbf{X}-\mathbf{Y}\right)^{2}\right]=a^{2}E\left[\mathbf{X}^{2}\right]-2aE\left[\mathbf{XY}\right]+E\left[\mathbf{Y}^{2}\right]</math>  
<math class="inline">\Longrightarrow</math> the quadratic in <math class="inline">a</math> has complex roots <math class="inline">\Longrightarrow</math> the “discriminant” of this quadratic is negative:
  
<math class="inline">4E\left[\mathbf{XY}\right]^{2}-4E\left[\mathbf{X}^{2}\right]E\left[\mathbf{Y}^{2}\right]<0\Longrightarrow\left|E\left[\mathbf{XY}\right]\right|<\sqrt{E\left[\mathbf{X}^{2}\right]E\left[\mathbf{Y}^{2}\right]}</math>
  
(ii)  <math class="inline">0=E\left[\left(a\mathbf{X}-\mathbf{Y}\right)^{2}\right]=a^{2}E\left[\mathbf{X}^{2}\right]-2aE\left[\mathbf{XY}\right]+E\left[\mathbf{Y}^{2}\right]</math> for some <math class="inline">a=a_{0}</math> .
  
n.b. The discriminant is equal to <math class="inline">0</math>  when <math class="inline">\mathbf{Y}=a_{0}\mathbf{X}</math> .
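The inequality just proved, and its equality case, can be sanity-checked by simulation. A rough Monte Carlo sketch assuming NumPy; the sample size, distributions, and the value of <math class="inline">a_{0}</math> are hypothetical choices for illustration:

```python
import numpy as np

# Empirical check of |E[XY]| <= sqrt(E[X^2] E[Y^2]) on sampled data.
rng = np.random.default_rng(1)
n = 100_000
x = rng.normal(size=n)
y = rng.normal(size=n)  # independent of x

lhs = abs(np.mean(x * y))
rhs = np.sqrt(np.mean(x ** 2) * np.mean(y ** 2))
print(lhs <= rhs)  # strict inequality here, since y is not a multiple of x

# Equality case: Y = a0 * X (a0 = -3.0 is a hypothetical constant).
a0 = -3.0
y2 = a0 * x
lhs2 = abs(np.mean(x * y2))
rhs2 = np.sqrt(np.mean(x ** 2) * np.mean(y2 ** 2))
print(np.isclose(lhs2, rhs2))  # equality (up to rounding) when Y = a0 X
```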
 
----
 
 
[[ECE600|Back to ECE600]]
 
  
 
[[ECE 600 Prerequisites|Back to ECE 600 Prerequisites]]
 

Latest revision as of 10:31, 30 November 2010

1.10 Two Random Variables

From the ECE600 Pre-requisites notes of Sangchun Han, ECE PhD student.


