Example of Turning Conditional Distributions Around

Let $ \mathbb{X} $ and $ \mathbb{Y} $ be jointly distributed discrete random variables with ranges $ X = \{0, 1, 2, 3, 4\} $ and $ Y = \{0, 1, 2\} $ respectively.

Suppose that the conditional distributions $ P_{\mathbb{X}|\mathbb{Y}} $ are empirically estimated as follows:


$ x $                                   0      1      2      3      4
$ P_{\mathbb{X}|\mathbb{Y}}(x|y=0) $    .175   .635   .159   .000   .031


$ x $                                   0      1      2      3      4
$ P_{\mathbb{X}|\mathbb{Y}}(x|y=1) $    .048   .000   .143   .238   .571


$ x $                                   0      1      2      3      4
$ P_{\mathbb{X}|\mathbb{Y}}(x|y=2) $    .188   .562   .250   .000   .000


and the marginal $ P_{\mathbb{Y}} $ is empirically estimated as:


$ y $                   0     1     2
$ P_{\mathbb{Y}}(y) $   .63   .21   .16

Estimate the conditional distributions $ P_{\mathbb{Y}|\mathbb{X}} $.



Solution 1

By definition $ P_{\mathbb{X}|\mathbb{Y}}(x|y) = \frac{P_{\mathbb{X},\mathbb{Y}}(x,y)}{P_{\mathbb{Y}}(y)} $, so the joint distribution can be computed as $ P_{\mathbb{X},\mathbb{Y}}(x,y) = P_{\mathbb{X}|\mathbb{Y}}(x|y)P_{\mathbb{Y}}(y) $.

$ P_{\mathbb{X},\mathbb{Y}}(0,0) = P_{\mathbb{X}|\mathbb{Y}}(0|0)P_{\mathbb{Y}}(0) = .175 \cdot .63 \approx .11 $

Computing the rest of the distribution similarly:

$ P_{\mathbb{X},\mathbb{Y}}(x,y) $, with rows indexed by $ y $ and columns by $ x $:

y \ x     0     1     2     3     4
  0      .11   .40   .10   .00   .02
  1      .01   .00   .03   .05   .12
  2      .03   .09   .04   .00   .00
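
As a quick numerical check, the joint table can be reproduced by scaling each conditional row by the corresponding marginal probability. A minimal Python/NumPy sketch (array names are illustrative, not part of the original problem):

```python
import numpy as np

# Given conditionals P(x | y): rows are y = 0, 1, 2; columns are x = 0, ..., 4.
P_X_given_Y = np.array([
    [.175, .635, .159, .000, .031],   # P(x | y = 0)
    [.048, .000, .143, .238, .571],   # P(x | y = 1)
    [.188, .562, .250, .000, .000],   # P(x | y = 2)
])
P_Y = np.array([.63, .21, .16])       # given marginal P(y)

# Joint: P(x, y) = P(x | y) * P(y), i.e. scale row y by P(y).
P_XY = P_X_given_Y * P_Y[:, None]
print(np.round(P_XY, 2))              # matches the table above
```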

The marginal distribution $ P_\mathbb{X} $ can be extracted from the joint distribution as:

$ P_\mathbb{X}(x) = \sum_{y\in Y} P_{\mathbb{X},\mathbb{Y}}(x,y) $

$ P_\mathbb{X}(0) = .11 + .01 + .03 = .15 $

Computing the rest of the distribution similarly:


$ x $                   0     1     2     3     4
$ P_{\mathbb{X}}(x) $   .15   .49   .17   .05   .14
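
This marginalization is a single column-wise sum over the joint table. A minimal sketch using the rounded joint values from above (names illustrative):

```python
import numpy as np

# Rounded joint P(x, y) from the table above; rows are y, columns are x.
P_XY = np.array([
    [.11, .40, .10, .00, .02],
    [.01, .00, .03, .05, .12],
    [.03, .09, .04, .00, .00],
])

# P(x) = sum over y of P(x, y): collapse the y (row) axis.
P_X = P_XY.sum(axis=0)
print(np.round(P_X, 2))   # [0.15 0.49 0.17 0.05 0.14]
```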


Finally, $ P_{\mathbb{Y}|\mathbb{X}} $ can be computed from the definition of conditional probability.

$ P_{\mathbb{Y}|\mathbb{X}}(0|0) = \frac{P_{\mathbb{X},\mathbb{Y}}(0,0)}{P_{\mathbb{X}}(0)} = \frac{.11}{.15} \approx .733 $

Computing the rest similarly:


$ y $                                   0      1      2
$ P_{\mathbb{Y}|\mathbb{X}}(y|x=0) $    .733   .067   .200


$ y $                                   0      1      2
$ P_{\mathbb{Y}|\mathbb{X}}(y|x=1) $    .816   .000   .184


$ y $                                   0      1      2
$ P_{\mathbb{Y}|\mathbb{X}}(y|x=2) $    .588   .176   .235


$ y $                                   0      1      2
$ P_{\mathbb{Y}|\mathbb{X}}(y|x=3) $    .000   1.00   .000


$ y $                                   0      1      2
$ P_{\mathbb{Y}|\mathbb{X}}(y|x=4) $    .143   .857   .000


Note from these $ P_{\mathbb{Y}|\mathbb{X}} $ distributions that for large $ x $ (here $ x=3 $ or $ x=4 $) it is highly probable that $ y=1 $, while for small $ x $ it is highly probable that $ y=0 $.
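
All five conditional distributions above can be produced at once by dividing each column of the joint table by the corresponding marginal entry. A minimal sketch with the rounded values used in Solution 1 (names illustrative):

```python
import numpy as np

# Rounded joint P(x, y); rows are y = 0, 1, 2, columns are x = 0, ..., 4.
P_XY = np.array([
    [.11, .40, .10, .00, .02],
    [.01, .00, .03, .05, .12],
    [.03, .09, .04, .00, .00],
])
P_X = P_XY.sum(axis=0)            # marginal P(x)

# P(y | x) = P(x, y) / P(x): divide every column by its column sum.
P_Y_given_X = P_XY / P_X          # broadcasts P_X across rows
print(np.round(P_Y_given_X, 3))   # column x = 0 gives [0.733, 0.067, 0.2], etc.
```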

--Jvaught 22:34, 29 January 2010 (UTC)


Solution 2

Or, equivalently, we can use Bayes' Rule explicitly.

Bayes' Rule is:

$ P_{\mathbb{Y}|\mathbb{X}}(y|x) = \frac{P_{\mathbb{X}|\mathbb{Y}}(x|y)P_{\mathbb{Y}}(y)}{P_{\mathbb{X}}(x)} $

$ P_{\mathbb{X}}(x) $ can be computed using the law of total probability:

$ P_\mathbb{X}(x) = \sum_{y\in Y} P_{\mathbb{X}|\mathbb{Y}}(x|y)P_{\mathbb{Y}}(y) $

Thus, calculation of $ P_{\mathbb{Y}|\mathbb{X}}(Y=0|X=0) $ would proceed as follows:

$ P_{\mathbb{Y}|\mathbb{X}}(Y=0|X=0) = \frac{P_{\mathbb{X}|\mathbb{Y}}(X=0|Y=0)P_{\mathbb{Y}}(Y=0)}{P_{\mathbb{X}}(X=0)} = \frac{P_{\mathbb{X}|\mathbb{Y}}(X=0|Y=0)P_{\mathbb{Y}}(Y=0)}{\sum_{y\in Y} P_{\mathbb{X}|\mathbb{Y}}(X=0|Y=y)P_{\mathbb{Y}}(Y=y)} $

$ = \frac{0.175 \cdot 0.63}{0.175 \cdot 0.63 + 0.048 \cdot 0.21 + 0.188 \cdot 0.16} = \frac{.11025}{.15041} = 0.732996476 \approx 0.733 $

The rest of the conditional distribution can be computed similarly using Bayes' Rule and will result in the same answer as in Solution 1.
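
The same Bayes'-rule calculation can be sketched directly from the given $ P_{\mathbb{X}|\mathbb{Y}} $ and $ P_{\mathbb{Y}} $, without storing the joint. The function name below is illustrative, not part of the original:

```python
import numpy as np

# Given conditionals P(x | y): rows are y = 0, 1, 2; columns are x = 0, ..., 4.
P_X_given_Y = np.array([
    [.175, .635, .159, .000, .031],
    [.048, .000, .143, .238, .571],
    [.188, .562, .250, .000, .000],
])
P_Y = np.array([.63, .21, .16])   # given marginal P(y)

def p_y_given_x(y, x):
    """Bayes' Rule: P(y | x) = P(x | y) P(y) / sum_y' P(x | y') P(y')."""
    numerator = P_X_given_Y[y, x] * P_Y[y]
    denominator = np.dot(P_X_given_Y[:, x], P_Y)   # law of total probability
    return numerator / denominator

print(round(p_y_given_x(0, 0), 3))   # 0.733, matching the worked example above
```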

  • This solution is nearly identical to Solution 1, differing only in that it does not explicitly construct the joint probability mass function.


--Pritchey 14:38, 1 February 2010 (UTC)


Cost Trade-offs

The joint $ P_{\mathbb{X},\mathbb{Y}} $ and marginal $ P_\mathbb{X} $ are essentially common sub-expressions of Bayes' formula. Explicitly computing them once and reusing the results, as in Solution 1, avoids redundant computation. However, if only some of the conditionals $ P_{\mathbb{Y}|\mathbb{X}} $ are required, Solution 2 avoids computing entries of $ P_{\mathbb{X},\mathbb{Y}} $ and $ P_\mathbb{X} $ that will never be used; if all of $ P_{\mathbb{Y}|\mathbb{X}} $ is required, every entry of $ P_{\mathbb{X},\mathbb{Y}} $ and $ P_\mathbb{X} $ is used.


In this example, Solution 1 requires 15 multiplies, 10 adds, and 15 divides, whereas Solution 2 requires 60 multiplies, 30 adds, and 15 divides. However, suppose that only $ P_{\mathbb{Y}|\mathbb{X}}(Y|X=0) $ were required. Then Solution 1 would still require 15 multiplies and 10 adds but only 3 divides, whereas Solution 2 would require only 12 multiplies, 6 adds, and 3 divides.
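
These counts follow from simple general formulas (assuming, as counted above, that Solution 2 recomputes the Bayes denominator for every entry). With $ |X| = 5 $ values of $ x $ and $ |Y| = 3 $ values of $ y $, Solution 1 uses $ |X||Y| = 15 $ multiplies (the joint), $ |X|(|Y|-1) = 10 $ adds (the marginal), and $ |X||Y| = 15 $ divides. Solution 2 uses $ |X||Y|(|Y|+1) = 60 $ multiplies, $ |X||Y|(|Y|-1) = 30 $ adds, and $ |X||Y| = 15 $ divides; restricting to a single value of $ x $ replaces the factor $ |X||Y| $ by $ |Y| $, giving the 12 multiplies, 6 adds, and 3 divides quoted above.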


In terms of space, Solution 2 has no unnecessary overhead, whereas Solution 1 has some. In some cases the overhead for $ P_{\mathbb{X},\mathbb{Y}} $ can be avoided: since each entry of $ P_{\mathbb{Y}|\mathbb{X}} $ depends on exactly one entry of $ P_{\mathbb{X},\mathbb{Y}} $, if all of $ P_{\mathbb{Y}|\mathbb{X}} $ is required, the memory that will eventually hold $ P_{\mathbb{Y}|\mathbb{X}} $ can temporarily hold $ P_{\mathbb{X},\mathbb{Y}} $. In that case the only memory overhead of Solution 1 is the space required for $ P_\mathbb{X} $. However, if some of the $ P_{\mathbb{Y}|\mathbb{X}} $ conditionals are not required, Solution 1 requires significantly more memory than Solution 2 (unless it is modified to compute only the entries of $ P_{\mathbb{X},\mathbb{Y}} $ needed for the required conditionals).
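
The buffer-reuse idea above can be sketched in a few lines: fill a single array with the joint, take the marginal from it, then overwrite it in place with the conditionals. A minimal NumPy sketch (names illustrative):

```python
import numpy as np

P_X_given_Y = np.array([
    [.175, .635, .159, .000, .031],
    [.048, .000, .143, .238, .571],
    [.188, .562, .250, .000, .000],
])
P_Y = np.array([.63, .21, .16])

# One buffer: first it holds the joint P(x, y) ...
buf = P_X_given_Y * P_Y[:, None]
P_X = buf.sum(axis=0)     # marginal P(x), the only extra storage
buf /= P_X                # ... then it is overwritten in place with P(y | x)
print(np.round(buf, 3))
```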

--Jvaught 08:35, 2 February 2010 (UTC)


Back to ECE662_topic2_discussions

Back to 2010 Spring ECE 662 mboutin
