


Random Variables and Signals

Topic 12: Independent Random Variables



We have previously defined statistical independence of two events A and B in F. We will now use that definition to define independence of the random variables X and Y.

Definition $ \qquad $ Two random variables X and Y on (S,F,P) are statistically independent if the events {X ∈ A} and {Y ∈ B} are independent ∀A,B ∈ B(R), i.e.

$ P(\{X\in A\}\cap\{Y\in B\})=P(X\in A)P(Y\in B) \quad\forall A,B\in\mathcal B(\mathbb R) $
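As a quick numerical illustration (an added sketch, not part of the original notes), the following Python snippet estimates both sides of the defining equality by Monte Carlo simulation for two independently generated standard normal variables; the sets A and B, the sample size, and the seed are arbitrary choices made only for this example.

import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000

# X and Y generated independently, each with a standard normal marginal.
x = rng.standard_normal(n)
y = rng.standard_normal(n)

# Arbitrary illustrative choices: A = (-inf, 0.5], B = (-inf, -0.3].
in_A = x <= 0.5
in_B = y <= -0.3

p_joint = np.mean(in_A & in_B)             # estimates P(X in A, Y in B)
p_product = np.mean(in_A) * np.mean(in_B)  # estimates P(X in A) P(Y in B)

print("P(X in A, Y in B)  ~", round(float(p_joint), 4))
print("P(X in A)P(Y in B) ~", round(float(p_product), 4))

The two printed estimates agree up to Monte Carlo error, as the definition requires for independent X and Y.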

There is an alternative definition of independence for random variables that is often used. We will show that X and Y are independent iff

$ f_{XY}(x,y)=f_X(x)f_Y(y)\quad\forall x,y\in\mathbb R $


First assume that X and Y are independent and let A = (-∞,x], B = (-∞,y]. Then,

$ \begin{align} F_{XY}(x,y) &= P(X\leq x,Y\leq y) \\ &= P(X\in A,Y\in B) \\ &= P(X\in A)P(Y\in B) \\ &= P(X\leq x)P(Y\leq y) \\ &= F_X(x)F_Y(y) \\ \Rightarrow f_{XY}(x,y) &= \frac{\partial^2}{\partial x\,\partial y}F_{XY}(x,y) = \frac{\partial^2}{\partial x\,\partial y}F_X(x)F_Y(y) = f_X(x)f_Y(y) \end{align} $

Now assume that $ f_{XY}(x,y)=f_X(x)f_Y(y) $ ∀x,y ∈ R. Then, for any A,B ∈ B(R),

$ \begin{align} P(X\in A,Y\in B) &= \int_A\int_Bf_{XY}(x,y)dydx \\ &=\int_A\int_Bf_X(x)f_Y(y)dydx \\ &=\int_Af_X(x)dx\int_Bf_Y(y)dy \\ &= P(X\in A)P(Y\in B) \end{align} $

Thus, X and Y are independent iff $ f_{XY}(x,y)=f_X(x)f_Y(y) $ ∀x,y ∈ R.
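As an added worked example of this criterion (not part of the original notes), consider the joint density

$ f_{XY}(x,y)=e^{-(x+y)},\quad x\geq 0,\ y\geq 0 $

It factors as $ e^{-x}\cdot e^{-y} $, the product of two Exp(1) marginal densities, so X and Y are independent. In contrast, the joint density $ f_{XY}(x,y)=x+y $ on the unit square has marginals $ f_X(x)=x+\tfrac{1}{2} $ and $ f_Y(y)=y+\tfrac{1}{2} $, and $ (x+\tfrac{1}{2})(y+\tfrac{1}{2})\neq x+y $ in general, so X and Y with this density are not independent.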






Questions and comments

If you have any questions, comments, etc. please post them on this page



Back to all ECE 600 notes
