[[ECE600_F13_notes_mhossain|Back to all ECE 600 notes]]<br/>
[[ECE600_F13_rv_distribution_mhossain|Previous Topic: Random Variables: Distributions]]<br/>
[[ECE600_F13_rv_Functions_of_random_variable_mhossain|Next Topic: Functions of a Random Variable]]
----
[[Category:ECE600]]
[[Category:probability]]
[[Category:lecture notes]]
[[Category:slecture]]
  
 
<center><font size= 4>
[[ECE600_F13_notes_mhossain|'''The Comer Lectures on Random Variables and Signals''']]
</font size>

[https://www.projectrhea.org/learning/slectures.php Slectures] by [[user:Mhossain | Maliha Hossain]]

<font size= 3> Topic 7: Random Variables: Conditional Distributions</font size>
</center>
 
  
----
----
We will now learn how to represent conditional probabilities using the cdf/pdf/pmf. This will provide us with some of the most powerful tools for working with random variables: the conditional pdf and conditional pmf.
Recall that <br/>
<center><math> P(A|B) = \frac{P(A\cap B)}{P(B)}</math></center>
∀ A,B ∈ ''F'' with P(B) > 0.
  
 
We will consider this conditional probability when A = {X≤x} for a continuous random variable or A = {X=x} for a discrete random variable.


==Discrete X==

If P(B)>0, then let <br/>
<center><math> p_X(x|B)\equiv P(X=x|B)=\frac{P(\{X=x\}\cap B)}{P(B)}</math></center>
∀x ∈ ''R'', for a given B ∈ ''F''. <br/>
The function <math>p_X</math> is the conditional pmf of X. Recall [[ECE600_F13_Conditional_probability_mhossain|Bayes' theorem and the Total Probability Law]]:<br/>
 
<center><math> P(A|B)=\frac{P(B|A)P(A)}{P(B)};\quad P(B), P(A)>0</math></center>
and <br/>
<center><math>P(B)=\sum_{i = 1}^nP(B|A_i)P(A_i)</math></center>
if <math>A_1,...,A_n</math> form a partition of ''S'' and <math>P(A_i)>0</math> ∀<math>i</math>. <br/>
In the case A = {X=x}, we get <br/>
<center><math>p_X(x|B) = \frac{P(B|X=x)p_X(x)}{P(B)}</math></center>
where <math>p_X(x|B)</math> is the conditional pmf of X given B and <math>p_X(x)</math> is the pmf of X. Note that Bayes' Theorem in this context requires not only that P(B) > 0 but also that P(X = x) > 0.
  
 
We can also use the TPL to get <br/>
<center><math>p_X(x) = \sum_{i=1}^n p_X(x|A_i)P(A_i)</math></center>
 
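As a concrete illustration (my own example, not from the lecture), the following Python sketch computes the conditional pmf of a fair six-sided die given the event B = {X is even}; the names <code>p_X</code>, <code>B</code>, and <code>p_X_given_B</code> are hypothetical.

```python
# Hypothetical example (not from the lecture): conditional pmf of a fair
# six-sided die X given the event B = {X is even}.
from fractions import Fraction

# pmf of X: p_X(x) = 1/6 for x = 1, ..., 6
p_X = {x: Fraction(1, 6) for x in range(1, 7)}

B = {2, 4, 6}                        # conditioning event
P_B = sum(p_X[x] for x in B)         # P(B) = 1/2

# p_X(x|B) = P({X = x} ∩ B) / P(B): nonzero only when x is in B
p_X_given_B = {x: (p_X[x] / P_B if x in B else Fraction(0))
               for x in p_X}

print(p_X_given_B[2])                # 1/3: each even outcome is equally likely
assert sum(p_X_given_B.values()) == 1
```

Note that conditioning simply rescales the pmf on B by 1/P(B) and zeroes it elsewhere, so the result still sums to 1.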
==Continuous X==
Let A = {X≤x}. Then if P(B)>0, B ∈ ''F'', define <br/>
 
<center><math>F_X(x|B)\equiv P(X\leq x|B) = \frac{P(\{X\leq x\}\cap B)}{P(B)}</math></center>
as the conditional cdf of X given B.<br/>
The conditional pdf of X given B is then <br/>
<center><math>f_X(x|B) = \frac{d}{dx}F_X(x|B)</math></center>
 
Note that B may be an event involving X. <br/>
'''Example:''' let B = {X ≤ a} for some a ∈ '''R'''. Then <br/>
 
<center><math>F_X(x|B) = \frac{P(\{X\leq x\}\cap\{X\leq a\})}{P(X\leq a)}</math></center>
 
Two cases:
* Case (i): <math>x > a</math><br/>
<center><math>F_X(x|B) = \frac{P(X\leq a)}{P(X\leq a)} = 1</math></center>
* Case (ii): <math>x \leq a</math><br/>
<center><math>F_X(x|B) = \frac{P(X\leq x)}{P(X\leq a)} = \frac{F_X(x)}{F_X(a)}</math></center>
  
  
<center>[[Image:fig1_rv__conditional_distributions.png|400px|thumb|left|Fig 1: ''{X ≤ x} ∩ {X ≤ a}'' for the two different cases.]]</center>


Now,<br/>
<center><math>f_X(x|B) = f_X(x|X\leq a)=\begin{cases}
0 & x>a \\
\frac{f_X(x)}{F_X(a)} & x\leq a
\end{cases}
</math></center>


<center>[[Image:fig2_rv__conditional_distributions.png|400px|thumb|left|Fig 2: <math>f_X(x)</math> and <math>f_X(x|X\leq a)</math>.]]</center>
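This truncation is easy to check numerically. A minimal sketch (my own example, not from the notes; X ~ Exponential(1) and a = 2 are assumed): the conditional pdf f<math>_X</math>(x<math>|</math>X ≤ a) = f<math>_X</math>(x)/F<math>_X</math>(a) on (0, a] must still integrate to 1.

```python
# Sketch (assumed example, not from the notes): for X ~ Exponential(1) and
# B = {X <= a}, the conditional pdf is f_X(x|B) = f_X(x)/F_X(a) for x <= a
# and 0 for x > a; like any pdf, it must integrate to 1.
import math

a = 2.0                              # assumed truncation point
F_a = 1 - math.exp(-a)               # F_X(a) for Exponential(1)

def f_X(x):
    return math.exp(-x) if x >= 0 else 0.0

def f_X_given_B(x):
    return f_X(x) / F_a if x <= a else 0.0

# Riemann-sum check that the conditional pdf integrates to 1 over (0, a]
dx = 1e-4
total = sum(f_X_given_B(k * dx) * dx for k in range(int(a / dx)))
print(round(total, 3))               # ≈ 1.0
```

Dividing by F<math>_X</math>(a) is exactly the renormalization that makes the truncated density a valid pdf.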
  
Bayes' Theorem for continuous X:<br/>
We can easily see that <br/>
<center><math>F_X(x|B)= \frac{P(B|X\leq x)F_X(x)}{P(B)}</math></center>
from the previous version of Bayes' Theorem, and that <br/>
<center><math>F_X(x)=\sum_{i=1}^n F_X(x|A_i)P(A_i)</math></center>
if <math>A_1,...,A_n</math> form a partition of ''S'' and P(<math>A_i</math>) > 0 ∀<math>i</math>, from the TPL.<br/>
But what we often want to know is a probability of the type P(''A''|''X''=''x'') for some ''A''∈''F''. We could define this as <br/>
<center><math>P(A|X=x)\equiv\frac{P(A\cap \{X=x\})}{P(X=x)}</math></center>
but the right hand side (rhs) would be 0/0 since ''X'' is continuous. <br/>
Instead, we will use the following definition in this case:<br/>
<center><math>P(A|X=x)\equiv\lim_{\Delta x\rightarrow 0}P(A|x<X\leq x+\Delta x),\quad \Delta x > 0</math></center>
using our standard definition of conditional probability for the rhs. This leads to the following derivation:<br/>
<center><math>\begin{align}
P(A|X=x) &= \lim_{\Delta x\rightarrow 0}\frac{P(x<X\leq x+\Delta x|A)P(A)}{P(x<X\leq x+\Delta x)} \\
\\
&= P(A)\lim_{\Delta x\rightarrow 0}\frac{F_X(x+\Delta x|A)-F_X(x|A)}{F_X(x+\Delta x)-F_X(x)} \\
\\
&= P(A)\frac{\lim_{\Delta x\rightarrow 0}\frac{F_X(x+\Delta x|A)-F_X(x|A)}{\Delta x}}{\lim_{\Delta x\rightarrow 0}\frac{F_X(x+\Delta x)-F_X(x)}{\Delta x}}\\
\\
&=P(A)\frac{f_X(x|A)}{f_X(x)}
\end{align}</math></center>


So, <br/>
<center><math>P(A|X=x)=\frac{f_X(x|A)P(A)}{f_X(x)} </math></center>

This is how Bayes' Theorem is normally stated for a continuous random variable X and an event ''A''∈''F'' with P(''A'') > 0.

We will revisit Bayes' Theorem one more time when we discuss [[ECE600_F13_Conditional_Distributions_for_Two_Random_Variables_mhossain| conditional distributions for two random variables]].
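As a quick sanity check of the continuous form of Bayes' Theorem (my own example, not from the notes; X ~ Exponential(1) and A = {X ≤ a} are assumptions), P(A<math>|</math>X = x) = f<math>_X</math>(x<math>|</math>A)P(A)/f<math>_X</math>(x) should equal 1 for x ≤ a, since knowing X = x ≤ a makes A certain, and 0 for x > a:

```python
# Sanity check of Bayes' Theorem for continuous X (assumed example, not
# from the notes): X ~ Exponential(1), A = {X <= a}. Then
#   P(A | X = x) = f_X(x|A) P(A) / f_X(x)
# should be 1 for x <= a (A is certain once X = x <= a) and 0 for x > a.
import math

a = 2.0                              # assumed threshold

def f_X(x):
    return math.exp(-x) if x >= 0 else 0.0

P_A = 1 - math.exp(-a)               # P(A) = F_X(a)

def f_X_given_A(x):
    # truncated pdf f_X(x | X <= a), as derived above for B = {X <= a}
    return f_X(x) / P_A if 0 <= x <= a else 0.0

def P_A_given_X(x):
    return f_X_given_A(x) * P_A / f_X(x)

print(round(P_A_given_X(0.5), 6))    # 1.0
print(P_A_given_X(3.0))              # 0.0
```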
  
  
----
==[[Talk:ECE600_F13_rv_conditional_distribution_mhossain|Questions and comments]]==

If you have any questions, comments, etc. please post them on [[Talk:ECE600_F13_rv_conditional_distribution_mhossain|this page]].


----

[[ECE600_F13_notes_mhossain|Back to all ECE 600 notes]]<br/>
[[ECE600_F13_rv_distribution_mhossain|Previous Topic: Random Variables: Distributions]]<br/>
[[ECE600_F13_rv_Functions_of_random_variable_mhossain|Next Topic: Functions of a Random Variable]]
