Random Variables and Signals
Topic 12: Independent Random Variables
We have previously defined statistical independence of two events A and B in F. We will now use that definition to define independence of the random variables X and Y.
Definition $ \qquad $ Two random variables X and Y on (S,F,P) are statistically independent if the events {X ∈ A} and {Y ∈ B} are independent ∀A,B ∈ B(R), i.e.

$ P(\{X\in A\}\cap\{Y\in B\})=P(X\in A)P(Y\in B) \quad\forall A,B\in\mathcal{B}(\mathbb{R}) $
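As a simple illustration (an example added here, not from the original notes), let S model two fair coin flips, let X be 1 if the first flip is heads and 0 otherwise, and let Y be 1 if the second flip is heads and 0 otherwise. Taking A = B = {1},

$ P(X=1,Y=1)=\frac{1}{4}=\frac{1}{2}\cdot\frac{1}{2}=P(X=1)P(Y=1) $

and the same factorization holds for every choice of A and B, so X and Y are independent.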
There is an alternative definition of independence for random variables that is often used. We will show that X and Y are independent iff

$ f_{XY}(x,y)=f_X(x)f_Y(y)\quad\forall x,y\in\mathbb{R} $
First assume that X and Y are independent and let A = (-∞,x], B = (-∞,y]. Then,

$ \begin{align} F_{XY}(x,y) &= P(X\leq x,Y\leq y) \\ &= P(X\in A,Y\in B) \\ &= P(X\in A)P(Y\in B) \\ &= P(X\leq x)P(Y\leq y) \\ &= F_X(x)F_Y(y) \end{align} $

and differentiating both sides with respect to x and y gives

$ f_{XY}(x,y)=f_X(x)f_Y(y) $
Now assume that $ f_{XY}(x,y)=f_X(x)f_Y(y) $ ∀x,y ∈ R. Then, for any A,B ∈ B(R),

$ \begin{align} P(X\in A,Y\in B) &= \int_A\int_B f_{XY}(x,y)\,dy\,dx \\ &= \int_A\int_B f_X(x)f_Y(y)\,dy\,dx \\ &= \int_A f_X(x)\,dx\int_B f_Y(y)\,dy \\ &= P(X\in A)P(Y\in B) \end{align} $
Thus, X and Y are independent iff $ f_{XY}(x,y)=f_X(x)f_Y(y) $ for all x,y ∈ R.
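To see the criterion in action, here is a small numerical sketch (added here, not part of the original notes; the exponential rates 2 and 3 and the sets A = [0, 0.5], B = [0, 0.2] are arbitrary choices). The joint density $ f_{XY}(x,y)=2e^{-2x}\cdot 3e^{-3y} $ for x,y ≥ 0 factorizes into its marginals, so X and Y are independent, and a Monte Carlo estimate of $ P(X\in A,Y\in B) $ should match the product $ P(X\in A)P(Y\in B) $ up to sampling error.

# Minimal sketch: check the product rule by simulation for independent exponentials.
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000
x = rng.exponential(scale=1/2, size=n)   # X ~ Exponential with rate 2
y = rng.exponential(scale=1/3, size=n)   # Y ~ Exponential with rate 3, drawn independently of X

in_A = x <= 0.5                          # event {X in A} with A = [0, 0.5]
in_B = y <= 0.2                          # event {Y in B} with B = [0, 0.2]

joint = np.mean(in_A & in_B)             # estimate of P(X in A, Y in B)
product = np.mean(in_A) * np.mean(in_B)  # estimate of P(X in A) P(Y in B)
print(joint, product)                    # the two estimates agree up to Monte Carlo error

Repeating the experiment with Y replaced by, say, X + noise would show the two estimates drifting apart, since the joint density no longer factorizes.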
References
- M. Comer. ECE 600. Class Lecture. Random Variables and Signals. Faculty of Electrical Engineering, Purdue University. Fall 2013.
Questions and comments
If you have any questions, comments, etc., please post them on this page.