
Example of Turning Conditional Distributions Around

Let $ \mathbb{X} $ and $ \mathbb{Y} $ be jointly distributed discrete random variables with ranges $ X = \{0, 1, 2, 3, 4\} $ and $ Y = \{0, 1, 2\} $, respectively.

Suppose that the conditional distributions $ P_{\mathbb{X}|\mathbb{Y}} $ are empirically estimated as follows:


$ x $                                     0      1      2      3      4
$ P_{\mathbb{X}|\mathbb{Y}}(x|y=0) $    .175   .635   .159   .000   .031
$ P_{\mathbb{X}|\mathbb{Y}}(x|y=1) $    .048   .000   .143   .238   .571
$ P_{\mathbb{X}|\mathbb{Y}}(x|y=2) $    .188   .562   .250   .000   .000


and the marginal $ P_{\mathbb{Y}} $ is empirically estimated as:


$ y $                    0     1     2
$ P_{\mathbb{Y}}(y) $   .63   .21   .16

Estimate the conditional distributions $ P_{\mathbb{Y}|\mathbb{X}} $.



Solution 1

By definition $ P_{\mathbb{X}|\mathbb{Y}}(x|y) = \frac{P_{\mathbb{X},\mathbb{Y}}(x,y)}{P_{\mathbb{Y}}(y)} $, so the joint distribution $ P_{\mathbb{X},\mathbb{Y}}(x,y) $ can be computed.

$ P_{\mathbb{X},\mathbb{Y}}(0,0) = P_{\mathbb{X}|\mathbb{Y}}(0|0)P_{\mathbb{Y}}(0) = .175 \cdot .63 = .11025 \approx .11 $

Computing the rest of the distribution similarly (rounding to two decimal places):

$ P_{\mathbb{X},\mathbb{Y}}(x,y) $   x=0   x=1   x=2   x=3   x=4
                             y=0     .11   .40   .10   .00   .02
                             y=1     .01   .00   .03   .05   .12
                             y=2     .03   .09   .04   .00   .00
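
As a sanity check, the joint table above can be reproduced numerically. Below is a minimal sketch in Python (assuming numpy is available; the variable names are only illustrative), with array rows indexed by $ y $ and columns by $ x $:

 import numpy as np
 
 # Conditional distributions P(x|y): row y = 0, 1, 2; column x = 0, ..., 4.
 p_x_given_y = np.array([[.175, .635, .159, .000, .031],
                         [.048, .000, .143, .238, .571],
                         [.188, .562, .250, .000, .000]])
 p_y = np.array([.63, .21, .16])
 
 # Joint P(x,y) = P(x|y) P(y): scale each row y by P_Y(y).
 joint = p_x_given_y * p_y[:, None]
 print(np.round(joint, 2))   # matches the table above (to two decimals)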

The marginal distribution $ P_\mathbb{X} $ can be extracted from the joint distribution as:

$ P_\mathbb{X}(x) = \sum_{y\in Y} P_{\mathbb{X},\mathbb{Y}}(x,y) $

$ P_\mathbb{X}(0) = .11 + .01 + .03 = .15 $

Computing the rest of the distribution similarly:


$ x $                    0     1     2     3     4
$ P_{\mathbb{X}}(x) $   .15   .49   .17   .05   .14
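
This marginalization is just a column sum over the joint table; here is a minimal numpy sketch (illustrative only), starting from the rounded joint values above:

 import numpy as np
 
 # Joint P(x,y) from the table above: row y = 0, 1, 2; column x = 0, ..., 4.
 joint = np.array([[.11, .40, .10, .00, .02],
                   [.01, .00, .03, .05, .12],
                   [.03, .09, .04, .00, .00]])
 
 # Marginalize out y by summing each column.
 p_x = joint.sum(axis=0)
 print(p_x)   # [0.15 0.49 0.17 0.05 0.14], up to floating-point rounding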


Finally, $ P_{\mathbb{Y}|\mathbb{X}} $ can be computed from the definition of conditional probability.

$ P_{\mathbb{Y}|\mathbb{X}}(0|0) = \frac{P_{\mathbb{X},\mathbb{Y}}(0,0)}{P_{\mathbb{X}}(0)} = \frac{.11}{.15} = .733 $

Computing the rest similarly:


$ y $                                     0      1      2
$ P_{\mathbb{Y}|\mathbb{X}}(y|x=0) $    .733   .067   .200
$ P_{\mathbb{Y}|\mathbb{X}}(y|x=1) $    .816   .000   .184
$ P_{\mathbb{Y}|\mathbb{X}}(y|x=2) $    .588   .176   .236
$ P_{\mathbb{Y}|\mathbb{X}}(y|x=3) $    .000  1.000   .000
$ P_{\mathbb{Y}|\mathbb{X}}(y|x=4) $    .143   .857   .000
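
These conditional tables amount to dividing each column of the joint table by the corresponding marginal $ P_{\mathbb{X}}(x) $; a minimal numpy sketch (again starting from the rounded joint values):

 import numpy as np
 
 # Joint P(x,y): row y = 0, 1, 2; column x = 0, ..., 4.
 joint = np.array([[.11, .40, .10, .00, .02],
                   [.01, .00, .03, .05, .12],
                   [.03, .09, .04, .00, .00]])
 p_x = joint.sum(axis=0)                  # marginal P_X
 
 # P(y|x) = P(x,y) / P_X(x): divide each column by its marginal.
 p_y_given_x = joint / p_x                # broadcasts over columns
 print(np.round(p_y_given_x[:, 0], 3))    # column x = 0: [0.733 0.067 0.2]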


Note from these $ P_{\mathbb{Y}|\mathbb{X}} $ distributions that for large $ x $ it is highly probable that $ y=1 $, and for small $ x $ it is highly probable that $ y=0 $.

--Jvaught 22:34, 29 January 2010 (UTC)


Solution 2

Or, equivalently, we can use Bayes' Rule explicitly.

Bayes' Rule is:

$ P_{\mathbb{Y}|\mathbb{X}}(y|x) = \frac{P_{\mathbb{X}|\mathbb{Y}}(x|y)P_{\mathbb{Y}}(y)}{P_{\mathbb{X}}(x)} $

$ P_{\mathbb{X}}(x) $ can be computed using the law of total probability:

$ P_\mathbb{X}(x) = \sum_{y\in Y} P_{\mathbb{X}|\mathbb{Y}}(x|y)P_{\mathbb{Y}}(y) $

Thus, calculation of $ P_{\mathbb{Y}|\mathbb{X}}(Y=0|X=0) $ would proceed as follows:

$ P_{\mathbb{Y}|\mathbb{X}}(Y=0|X=0) = \frac{P_{\mathbb{X}|\mathbb{Y}}(X=0|Y=0)P_{\mathbb{Y}}(Y=0)}{P_{\mathbb{X}}(X=0)} = \frac{P_{\mathbb{X}|\mathbb{Y}}(X=0|Y=0)P_{\mathbb{Y}}(Y=0)}{\sum_{y\in Y} P_{\mathbb{X}|\mathbb{Y}}(X=0|Y=y)P_{\mathbb{Y}}(Y=y)} $

$ = \frac{0.175 \cdot 0.63}{0.175 \cdot 0.63 + 0.048 \cdot 0.21 + 0.188 \cdot 0.16} = \frac{.11025}{.15041} = 0.732996476 \approx 0.733 $

The rest of the conditional distribution can be computed similarly using Bayes' Rule and will result in the same answer as in Solution 1.
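
For completeness, here is a minimal numpy sketch of this direct application of Bayes' Rule for $ x = 0 $ (illustrative variable names; the arrays mirror the tables in the problem statement):

 import numpy as np
 
 # P(x|y): row y = 0, 1, 2; column x = 0, ..., 4, and P_Y as given.
 p_x_given_y = np.array([[.175, .635, .159, .000, .031],
                         [.048, .000, .143, .238, .571],
                         [.188, .562, .250, .000, .000]])
 p_y = np.array([.63, .21, .16])
 
 x = 0
 evidence = p_x_given_y[:, x] @ p_y                 # P_X(0) = 0.15041
 posterior = p_x_given_y[:, x] * p_y / evidence     # P(y|x=0), y = 0, 1, 2
 print(np.round(posterior, 3))                      # [0.733 0.067 0.2]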

  • This solution is nearly identical to Solution 1, differing only in that it does not construct the joint probability mass function explicitly.


--Pritchey 14:38, 1 February 2010 (UTC)


Back to ECE662_topic2_discussions

Back to 2010 Spring ECE 662 mboutin
