Latest revision as of 11:11, 21 May 2014
Back to all ECE 600 notes
Previous Topic: Random Variables: Distributions
Next Topic: Functions of a Random Variable
The Comer Lectures on Random Variables and Signals
Topic 7: Random Variables: Conditional Distributions
We will now learn how to represent conditional probabilities using the cdf/pdf/pmf. This will provide us with some of the most powerful tools for working with random variables: the conditional pdf and conditional pmf.
Recall that

$ P(A|B) = \frac{P(A\cap B)}{P(B)} $

∀ A,B ∈ F with P(B) > 0.
We will consider this conditional probability when A = {X≤x} for a continuous random variable or A = {X=x} for a discrete random variable.
Discrete X
If P(B) > 0, then let

$ p_X(x|B) = P(X=x|B) = \frac{P(\{X=x\}\cap B)}{P(B)} $

∀x ∈ R, for a given B ∈ F.
The function $ p_X(\cdot|B) $ is the conditional pmf of X given B. Recall Bayes' Theorem and the Total Probability Law:

$ P(A|B) = \frac{P(B|A)P(A)}{P(B)} $

and

$ P(B) = \sum_{i=1}^n P(B|A_i)P(A_i) $

if $ A_1,...,A_n $ form a partition of S and $ P(A_i)>0 $ ∀i.
In the case A = {X=x}, we get

$ p_X(x|B) = \frac{P(B|X=x)\,p_X(x)}{P(B)} $

where $ p_X(x|B) $ is the conditional pmf of X given B and $ p_X(x) $ is the pmf of X. Note that Bayes' Theorem in this context requires not only that P(B) > 0 but also that P(X = x) > 0.
We can also use the TPL to get

$ p_X(x) = \sum_{i=1}^n p_X(x|A_i)P(A_i) $

if $ A_1,...,A_n $ form a partition of S with $ P(A_i)>0 $ ∀i.
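These discrete identities can be checked numerically. The following sketch (a hypothetical example, not part of the original notes) takes X to be a fair die roll and B = {X is even}, and verifies the discrete Bayes' Theorem and the TPL with exact rational arithmetic:

```python
from fractions import Fraction

# Hypothetical example: X is a fair die roll, B = {X even}.
pmf = {x: Fraction(1, 6) for x in range(1, 7)}          # p_X(x)
B = {2, 4, 6}
P_B = sum(pmf[x] for x in B)                             # P(B) = 1/2

# Conditional pmf: p_X(x|B) = P({X=x} ∩ B) / P(B)
pmf_given_B = {x: (pmf[x] / P_B if x in B else Fraction(0)) for x in pmf}

# Bayes' Theorem: p_X(x|B) = P(B|X=x) p_X(x) / P(B),
# where P(B|X=x) is 1 if x ∈ B and 0 otherwise.
for x in pmf:
    P_B_given_x = Fraction(1) if x in B else Fraction(0)
    assert pmf_given_B[x] == P_B_given_x * pmf[x] / P_B

# TPL with the partition A1 = {X even}, A2 = {X odd}:
# p_X(x) = p_X(x|A1)P(A1) + p_X(x|A2)P(A2)
A2 = {1, 3, 5}
P_A2 = sum(pmf[x] for x in A2)
pmf_given_A2 = {x: (pmf[x] / P_A2 if x in A2 else Fraction(0)) for x in pmf}
for x in pmf:
    assert pmf[x] == pmf_given_B[x] * P_B + pmf_given_A2[x] * P_A2

print("Bayes' Theorem and TPL verified for the die example")
```

Exact fractions are used so the identities hold with equality rather than to floating-point tolerance.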
Continuous X
Let A = {X≤x}. Then if P(B)>0, B ∈ F, define

$ F_X(x|B) = P(X \leq x|B) = \frac{P(\{X \leq x\}\cap B)}{P(B)} $

as the conditional cdf of X given B.
The conditional pdf of X given B is then

$ f_X(x|B) = \frac{dF_X(x|B)}{dx} $
Note that B may be an event involving X.
Example: let B = {X ≤ a} for some a ∈ R. Then

$ F_X(x|B) = \frac{P(\{X \leq x\}\cap\{X \leq a\})}{P(X \leq a)} $

Two cases:
- Case (i): $ x \geq a $. Here $ \{X \leq x\}\cap\{X \leq a\} = \{X \leq a\} $, so $ F_X(x|B) = 1 $.
- Case (ii): $ x < a $. Here $ \{X \leq x\}\cap\{X \leq a\} = \{X \leq x\} $, so $ F_X(x|B) = F_X(x)/F_X(a) $.

Now, differentiating with respect to x,

$ f_X(x|B) = \begin{cases} \frac{f_X(x)}{F_X(a)} & x < a \\ 0 & x \geq a \end{cases} $
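The truncation example above can be checked by simulation. The sketch below (illustrative, not from the original notes) takes X exponential with rate 1, conditions on B = {X ≤ a}, and compares the empirical conditional cdf at a point x < a against F_X(x)/F_X(a):

```python
import math
import random

# Illustrative check: X ~ Exponential(lam), B = {X <= a}.
# For x < a, the notes give F_X(x|B) = F_X(x) / F_X(a).
random.seed(0)
lam, a, x = 1.0, 2.0, 1.0
N = 200_000

samples = [random.expovariate(lam) for _ in range(N)]
in_B = [s for s in samples if s <= a]                # samples falling in B

# Empirical conditional cdf: fraction of samples in B that are also <= x
emp = sum(1 for s in in_B if s <= x) / len(in_B)

F = lambda t: 1 - math.exp(-lam * t)                 # exponential cdf
theory = F(x) / F(a)                                 # predicted F_X(x|B)

print(f"empirical {emp:.4f} vs theoretical {theory:.4f}")
```

With this many samples the Monte Carlo estimate agrees with the formula to well within one percent.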
Bayes' Theorem for continuous X:
We can easily see that

$ F_X(x|B) = \frac{P(B|X \leq x)F_X(x)}{P(B)} $

from the previous version of Bayes' Theorem, and that

$ F_X(x) = \sum_{i=1}^n F_X(x|A_i)P(A_i) $

if $ A_1,...,A_n $ form a partition of S and P($ A_i $) > 0 ∀$ i $, from the TPL.
But what we often want to know is a probability of the type P(A|X=x) for some A∈F. We could try to define this as

$ P(A|X=x) = \frac{P(A\cap\{X=x\})}{P(X=x)} $

but the right hand side (rhs) would be 0/0 since X is continuous.
Instead, we will use the following definition in this case:

$ P(A|X=x) \equiv \lim_{\Delta x \to 0} P(A|x < X \leq x + \Delta x) $

using our standard definition of conditional probability for the rhs. This leads to the following derivation:

$ P(A|x < X \leq x+\Delta x) = \frac{P(x < X \leq x+\Delta x|A)P(A)}{P(x < X \leq x+\Delta x)} \approx \frac{f_X(x|A)\,\Delta x\,P(A)}{f_X(x)\,\Delta x} $

So,

$ P(A|X=x) = \frac{f_X(x|A)P(A)}{f_X(x)} $
This is how Bayes' Theorem is normally stated for a continuous random variable X and an event A∈F with P(A) > 0.
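A small numerical sketch makes this form of Bayes' Theorem concrete. All names and numbers below are hypothetical (not from the notes): A1 and A2 partition S, X given A1 is N(0,1) and X given A2 is N(3,1), and P(A1|X=x) is computed as f_X(x|A1)P(A1)/f_X(x), with f_X(x) obtained from the TPL:

```python
import math

def normal_pdf(x, mu, sigma):
    # Gaussian density, used here as the conditional pdf f_X(x|Ai)
    return math.exp(-((x - mu) ** 2) / (2 * sigma ** 2)) / (sigma * math.sqrt(2 * math.pi))

P_A1, P_A2 = 0.4, 0.6            # hypothetical priors; P(A1) + P(A2) = 1

def posterior_A1(x):
    f1 = normal_pdf(x, 0.0, 1.0)  # f_X(x|A1)
    f2 = normal_pdf(x, 3.0, 1.0)  # f_X(x|A2)
    f = f1 * P_A1 + f2 * P_A2     # TPL: f_X(x) = sum_i f_X(x|Ai)P(Ai)
    return f1 * P_A1 / f          # Bayes' Theorem for continuous X

for x in (-1.0, 1.5, 4.0):
    print(f"P(A1|X={x}) = {posterior_A1(x):.4f}")
```

As expected, the posterior probability of A1 is near 1 for x far below the midpoint of the two means and near 0 for x far above it.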
We will revisit Bayes' Theorem one more time when we discuss conditional distributions for two random variables.
References
- M. Comer. ECE 600. Class Lecture. Random Variables and Signals. Faculty of Electrical Engineering, Purdue University. Fall 2013.
Questions and comments
If you have any questions, comments, etc. please post them on this page