Communication, Networking, Signal and Image Processing (CS)
Question 1: Probability and Random Processes
August 2015
Solution 1
First of all, the conditional distribution can be written as:
$ P(X=x|X+Y=n) =\frac{P(X=x, X+Y=n)}{P(X+Y=n)} =\frac{P(X=x, Y=n-x)}{P(X+Y=n)} $
For the numerator,
$ P(X=x, Y=n-x) =P(X=x)P(Y=n-x)\\ =\frac{e^{-\lambda_1}\lambda_1^x}{x!}\times \frac{e^{-\lambda_2}\lambda_2^{n-x}}{(n-x)!}\\ =\frac{e^{-(\lambda_1+\lambda_2)}\lambda_1^x\lambda_2^{n-x}}{x!\,(n-x)!}\\ =\frac{e^{-(\lambda_1+\lambda_2)}}{n!}{n\choose x}\lambda_1^x\lambda_2^{n-x} $
Similarly, for the denominator,
$ P(X+Y=n) =\sum_{k=0}^{n}P(X=k,Y=n-k) =\sum_{k=0}^{n}P(X=k)P(Y=n-k)\\ =\frac{e^{-(\lambda_1+\lambda_2)}}{n!}\sum_{k=0}^{n}{n\choose k}\lambda_1^k\lambda_2^{n-k}\\ =\frac{e^{-(\lambda_1+\lambda_2)}}{n!}(\lambda_1+\lambda_2)^n $

where the last equality follows from the binomial theorem.
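As a quick sanity check of this convolution step, one can verify numerically that the sum of two independent Poisson random variables is again Poisson. Below is a minimal Python sketch using only the standard library; the rates $ \lambda_1 = 2.0 $ and $ \lambda_2 = 3.5 $ are arbitrary example values, not part of the problem.

 # Numerically verify that the convolution of Poisson(lambda1) and Poisson(lambda2)
 # pmfs equals the Poisson(lambda1 + lambda2) pmf.
 import math

 def poisson_pmf(k, lam):
     # P(N = k) for N ~ Poisson(lam)
     return math.exp(-lam) * lam**k / math.factorial(k)

 lam1, lam2 = 2.0, 3.5  # arbitrary example rates

 for n in range(10):
     convolution = sum(poisson_pmf(k, lam1) * poisson_pmf(n - k, lam2)
                       for k in range(n + 1))
     direct = poisson_pmf(n, lam1 + lam2)
     assert abs(convolution - direct) < 1e-12
 print("P(X+Y=n) matches the Poisson(lambda1+lambda2) pmf for n = 0..9")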
So, we get $ P(X=x|X+Y=n) = {n\choose x} \left(\frac{\lambda_1}{\lambda_1+\lambda_2}\right)^x\left(\frac{\lambda_2}{\lambda_1+\lambda_2}\right)^{n-x} $
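This is exactly the pmf of a Binomial$ \left(n, \frac{\lambda_1}{\lambda_1+\lambda_2}\right) $ random variable. The following minimal sketch checks the result numerically; the values of $ \lambda_1 $, $ \lambda_2 $, and $ n $ are again arbitrary choices made only for illustration.

 # Check that P(X=x, Y=n-x) / P(X+Y=n) equals the Binomial(n, p) pmf
 # with p = lambda1 / (lambda1 + lambda2).
 import math

 def poisson_pmf(k, lam):
     return math.exp(-lam) * lam**k / math.factorial(k)

 lam1, lam2, n = 2.0, 3.5, 8  # arbitrary example values
 p = lam1 / (lam1 + lam2)

 p_sum = poisson_pmf(n, lam1 + lam2)  # P(X+Y=n), using the result above
 for x in range(n + 1):
     conditional = poisson_pmf(x, lam1) * poisson_pmf(n - x, lam2) / p_sum
     binomial = math.comb(n, x) * p**x * (1 - p)**(n - x)
     assert abs(conditional - binomial) < 1e-12
 print("P(X=x | X+Y=n) matches the Binomial(n, p) pmf for x = 0..n")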
Solution 2
Let $ Z=X+Y $,
$ P_Z(k)=P(Z=k)=P(X+Y=k)\\ =\sum_{i=0}^{k}P(X=i)P(Y=k-i)=\sum_{i=0}^{k}\frac{\lambda_1^i e^{-\lambda_1}}{i!}\cdot\frac{\lambda_2^{k-i}e^{-\lambda_2}}{(k-i)!} =e^{-(\lambda_1+\lambda_2)}\cdot \frac{(\lambda_1+\lambda_2)^k}{k!} $
where the last equality uses the binomial theorem $ (a+b)^k=\sum_{i=0}^{k}\frac{k!}{i!(k-i)!}a^i b^{k-i} $.
Therefore,
$ P_{X|Z}(k|n)=\frac{P(X=k,Z=n)}{P(Z=n)}=\frac{P(X=k,X+Y=n)}{P(Z=n)}=\frac{P(X=k,Y=n-k)}{P(Z=n)}\\ =\frac{\lambda_1^k e^{-\lambda_1}}{k!}\cdot\frac{\lambda_2^{n-k} e^{-\lambda_2}}{(n-k)!}\cdot\frac{n!}{e^{-(\lambda_1+\lambda_2)}(\lambda_1+\lambda_2)^n} = \frac{\lambda_1^k \lambda_2^{n-k} }{(\lambda_1+\lambda_2)^n}\cdot\frac{n!}{k!(n-k)!} $
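Rearranging the last expression shows that it is the same binomial form obtained in Solution 1:

$ P_{X|Z}(k|n) = {n\choose k}\left(\frac{\lambda_1}{\lambda_1+\lambda_2}\right)^k\left(\frac{\lambda_2}{\lambda_1+\lambda_2}\right)^{n-k} $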
Solution 3
We will view this problem through the lens of Bayes' Theorem. As such, we can write the conditional distribution as
$ P(X = x | X+Y = n) = \frac{P(X = x, X+Y = n)}{P(X+Y = n)} = \frac{P(X = x, Y = n - x)}{\sum^n_{k = 0}P(X = k, Y = n - k)} $.
Since $ X $ and $ Y $ are independent, we can further write
$ P(X = x | X+Y = n) = \frac{P(X = x)P(Y = n - x)}{\sum^n_{k = 0}(P(X=k)P(Y = n-k))} $.
Now let us separate the above expression into numerator and denominator. Recalling that $ X $ and $ Y $ are independent Poisson r.v.s, the numerator is given by
$ P(X = x)P(Y = n - x) = \frac{e^{-\lambda_1}\lambda_1^x}{x!}\cdot\frac{e^{-\lambda_2}\lambda_2^{n-x}}{(n-x)!} $.
Multiplying the above by $ \frac{n!}{n!} $ gives
$ P(X = x)P(Y = n - x) = \frac{e^{-(\lambda_1 + \lambda_2)}}{n!}{n\choose x}\lambda_1^x\lambda_2^{n-x} $.
Now let us examine the denominator. Again, we make use of the fact that $ X $ and $ Y $ are independent Poisson r.v.s to write
$ \sum^n_{k = 0}(P(X=k)P(Y = n-k)) = \sum^n_{k = 0}\left(\frac{e^{-\lambda_1}\lambda_1^k}{k!}\frac{e^{-\lambda_2}\lambda_2^{n-k}}{(n-k)!}\right) $.
Again, we multiply by $ \frac{n!}{n!} $ to obtain
$ \sum^n_{k = 0}(P(X=k)P(Y = n-k)) = \frac{e^{-(\lambda_1 + \lambda_2)}}{n!}\sum^n_{k = 0}{n\choose k}\lambda_1^k\lambda_2^{n-k} $.
We can simplify this expression using the binomial formula, which states that
$ (a+b)^n = \sum^n_{k = 0}{n\choose k}a^k b^{n - k} $.
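(As a quick spot check of this identity, here is a one-line Python assertion with the arbitrary values $ a=2 $, $ b=3 $, $ n=4 $.)

 # Spot check of the binomial formula for a = 2, b = 3, n = 4.
 import math
 assert (2 + 3)**4 == sum(math.comb(4, k) * 2**k * 3**(4 - k) for k in range(5))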
We use this to write
$ \sum^n_{k = 0}(P(X=k)P(Y = n-k)) = \frac{e^{-(\lambda_1 + \lambda_2)}}{n!}\cdot(\lambda_1 + \lambda_2)^n $.
Putting this all together, we can finally write
$ P(X = x | X+Y = n) = \frac{\frac{e^{-(\lambda_1 + \lambda_2)}}{n!}{n\choose x}\lambda_1^x\lambda_2^{n-x}}{\frac{e^{-(\lambda_1 + \lambda_2)}}{n!}\cdot(\lambda_1 + \lambda_2)^n} = {n\choose x}\frac{\lambda_1^x\lambda_2^{n-x}}{(\lambda_1 + \lambda_2)^n} $
and we are done.
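All three solutions therefore give the same answer: conditioned on $ X+Y=n $, $ X $ is Binomial$ \left(n, \frac{\lambda_1}{\lambda_1+\lambda_2}\right) $. A small Monte Carlo sketch consistent with this conclusion is shown below; it assumes numpy is available, and the rates, $ n $, and the number of trials are arbitrary example values, not part of the problem.

 # Monte Carlo check: sample (X, Y), keep only the trials with X + Y == n,
 # and compare the empirical distribution of X against Binomial(n, lambda1/(lambda1+lambda2)).
 import math
 import numpy as np

 rng = np.random.default_rng(0)
 lam1, lam2, n, trials = 2.0, 3.5, 5, 1_000_000  # arbitrary example values

 x = rng.poisson(lam1, trials)
 y = rng.poisson(lam2, trials)
 kept = x[x + y == n]  # samples of X conditioned on the event X + Y = n

 p = lam1 / (lam1 + lam2)
 for k in range(n + 1):
     empirical = np.mean(kept == k)
     theoretical = math.comb(n, k) * p**k * (1 - p)**(n - k)
     print(f"k={k}: empirical {empirical:.4f} vs Binomial pmf {theoretical:.4f}")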
Similar Problem
If $ X $ and $ Y $ are independent binomial random variables with parameters $ (m, p) $ and $ (n, q) $ respectively, find the conditional probability mass function of $ X $ given $ X + Y = k $. In addition, investigate what happens to this p.m.f. when $ p = q $.
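To check an answer to this exercise numerically, the conditional pmf can be tabulated directly from the definition of conditional probability. The sketch below assumes $ X \sim \text{Binomial}(m,p) $ and $ Y \sim \text{Binomial}(n,q) $; the particular values of $ m $, $ n $, $ p $, $ q $, and $ k $ are arbitrary choices for illustration only.

 # Tabulate P(X = x | X + Y = k) for independent X ~ Binomial(m, p) and Y ~ Binomial(n, q),
 # directly from the definition of conditional probability.
 import math

 def binom_pmf(x, n, p):
     return math.comb(n, x) * p**x * (1 - p)**(n - x)

 m, n, p, q, k = 6, 4, 0.3, 0.3, 5  # arbitrary example values (here p == q)

 support = range(max(0, k - n), min(m, k) + 1)  # values of x compatible with X + Y = k
 denominator = sum(binom_pmf(i, m, p) * binom_pmf(k - i, n, q) for i in support)
 for x in support:
     conditional = binom_pmf(x, m, p) * binom_pmf(k - x, n, q) / denominator
     print(f"P(X={x} | X+Y={k}) = {conditional:.4f}")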