ECE Ph.D. Qualifying Exam
Communication, Networking, Signal and Image Processing (CS)
Question 1: Probability and Random Processes
August 2012
Jump to Problem 2, 3
Problem 3
Problem statement:
Let $ \mathbf{X}_{1} \dots \mathbf{X}_{n} \dots $ be a sequence of independent, identically distributed random variables, each uniformly distributed on the interval [0, 1], and hence having pdf
$ f_{X}\left(x\right)=\begin{cases} \begin{array}{lll} 1, \text{ for } 0 \leq x \leq1\\ 0, \text{ elsewhere. } \end{array}\end{cases} $
Let $ \mathbf{Y}_{n} $ be a new random variable defined by
$ \mathbf{Y}_{n} = \min \,\{ \mathbf{X}_1, \mathbf{X}_2, \dots, \mathbf{X}_n \} $
(a) Find the pdf of $ \mathbf{Y}_{n} $.
(b) Does the sequence $ \mathbf{Y}_{n} $ converge in probability?
(c) Does the sequence $ \mathbf{Y}_{n} $ converge in distribution? If yes, specify the cumulative distribution function of the random variable it converges to.
$ \color{blue}\text{Solution 1:} $
$ f_{X}(x) = 1_{[0,1]}(x) = \left\{ \begin{array}{ll} 1 & \quad 0 \leq x \leq 1 \\ 0 & \quad elsewhere \end{array} \right. $
$ F_{X}(x) = \int_0^x f_{X}(\alpha)d\alpha = \int_0^x d\alpha = x, \quad 0 \leq x \leq 1 $
$ Y_{n} = \min\{X_{1}, \dots, X_{n}\} $
(a)
$ \; \; \; F_{Y_{n}}(y) = P(\{Y \leq y\}) $
$ = P(\{min\{X_{1}, ..., X_{n}\} \leq y\}) $
$ = 1 - P(\{X_{1} > y\}, \dots ,\{X_{n} > y\}) $

$ = 1 - P(\{X_{1} > y\})P(\{X_{2} > y\}) \cdots P(\{X_{n} > y\}) $

$ = 1 - (1-y)^{n} $
$ f_{Y_{n}}(y) = \frac{dF_{Y_{n}}(y)}{dy} = n(1-y)^{n-1} 1_{[0, 1]}(y) $
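As an informal numerical check of this pdf, the sketch below (assuming NumPy is available; the choice n = 5 and the sample size are arbitrary) compares an empirical histogram of $ Y_n = \min\{X_1,\dots,X_n\} $ with $ n(1-y)^{n-1} $:

```python
import numpy as np

# Empirical check of f_{Y_n}(y) = n*(1-y)^(n-1) on [0, 1], where Y_n = min(X_1, ..., X_n)
# and the X_i are i.i.d. Uniform(0, 1).  n and the number of trials are arbitrary.
rng = np.random.default_rng(0)
n, trials = 5, 200_000
y = rng.uniform(0.0, 1.0, size=(trials, n)).min(axis=1)

# Compare an empirical density histogram with the analytic pdf on a coarse grid.
counts, edges = np.histogram(y, bins=20, range=(0.0, 1.0), density=True)
centers = 0.5 * (edges[:-1] + edges[1:])
analytic = n * (1.0 - centers) ** (n - 1)
print(np.max(np.abs(counts - analytic)))   # small, up to sampling/binning error
```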
(b)
Yes. It converges in probability.
A sequence of random variables $ Y_{n} $ is said to converge in probability to a constant a if

$ P\{\mid Y_{n} - a \mid > \varepsilon\} \rightarrow 0 \;\; \text{as} \;\; n \rightarrow \infty \;\; \text{for any} \;\; \varepsilon > 0. $
Consider
$ P\{\mid Y_{n}-0 \mid > \varepsilon\} = P\{Y_{n}>\varepsilon\} $
$ = 1 - P\{Y_{n} \leq \varepsilon\} = 1 - [1-(1-\varepsilon)^n] = (1-\varepsilon)^n $
$ \text{As } n\rightarrow \infty, \; \; \; P\{\mid Y_{n}-0 \mid > \varepsilon\} = 0 $
Since $ 0 < (1-\varepsilon) <1 $
Thus $ Y_{n} \rightarrow 0 $
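A minimal pure-Python illustration of this limit (the value $ \varepsilon = 0.05 $ is an arbitrary choice): since $ P\{|Y_n - 0| > \varepsilon\} = (1-\varepsilon)^n $, the probability decays geometrically in n.

```python
# P(|Y_n - 0| > eps) = (1 - eps)^n decays geometrically, so Y_n -> 0 in probability.
eps = 0.05                      # arbitrary tolerance
for n in (1, 10, 100, 1000):
    print(n, (1.0 - eps) ** n)  # e.g. n = 1000 gives roughly 5e-23
```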
(c)
From the solution to (a), we have
$ F_{Y_{n}}(y) = \left\{ \begin{array}{ll} 1 - (1-y)^{n} & \quad 0 < y \leq 1 \\ 1 & \quad y > 1 \end{array} \right. $
It converges in distribution since:
$ \text{convergence in probability} \Rightarrow \text{convergence in distribution} $
To find the convergence of CDF,
$ \lim_{n \to \infty} F_{Y_{n}}(y) = \lim_{n \to \infty} 1 - (1-y)^{n} = \left\{ \begin{array}{ll} 0 & \quad y \leq 0 \\ 1 & \quad y > 0 \end{array} \right. $
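The pointwise limit of the CDF can also be tabulated directly (pure Python; the y values and the values of n are arbitrary illustration points):

```python
# Pointwise limit of F_{Y_n}(y) = 1 - (1 - y)^n on [0, 1]: y = 0 stays at 0, every y > 0 tends to 1.
for y in (0.0, 0.01, 0.1, 0.5):           # arbitrary sample points
    for n in (10, 100, 1000):
        print(y, n, 1.0 - (1.0 - y) ** n)
```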
$ {\color{red} \text{*For part (a), there is a typo in the following equation}} $
$ {\color{red} F_{Y_{n}}(y) = P(\{Y \leq y\}) = P(min\{X_{1}, \dots, X_{n}\} \leq y) \text{, and should be changed to} } $
$ {\color{red} F_{Y_{n}}(y) = P(\{Y_{n} \leq y\}) = P(min\{X_{1}, \dots, X_{n}\} \leq y) } $
$ {\color{red} \text{*For part (b), I would like to add some additional parts to the following equation for better understanding}} $
$ {\color{red} \text{As } n\rightarrow \infty, \; \; \; P\{\mid Y_{n}-0 \mid > \varepsilon\} = 0} $
$ {\color{red} \text{Since } 0 < (1-\varepsilon) <1} $
$ {\color{red} \text{to }} $
$ {\color{red}\text{Since } 0 < (1-\varepsilon) <1 \text{, as } n\rightarrow \infty \Rightarrow \lim_{n \to \infty}(1-\varepsilon)^n = 0 \; \; \; \therefore \lim_{n \to \infty}P\{\mid Y_{n}-0 \mid > \varepsilon\} = \lim_{n \to \infty} (1-\varepsilon)^n = 0} $
$ {\color{red} \text{*For part (c), the CDF of the random variable } \mathbf{Y}_n \text{ is incorrect. It should be}} $
$ {\color{red} F_{Y_n}\left(y\right)=\begin{cases} \begin{array}{lll} 0, y <0 \\ 1-(1-y)^n, 0 \leq y \leq 1 \\ 1, y>1 \end{array}\end{cases} } $
$ {\color{red} \text{Therefore, it should be modified as follow}} $
$ {\color{red} \text{As } n \to \infty \text{,} \,\, F_{Y_n}(y) \to F_{Y}(y)=\begin{cases} \begin{array}{lll} 0, y <0 \\ 1, y \geq 0 \end{array}\end{cases} } $
$ \color{blue}\text{Solution 2:} $
(a)
Since $ \mathbf{Y}_{n} = \min \,\{ \mathbf{X}_1, \mathbf{X}_2, \dots, \mathbf{X}_n \} $, and $ \mathbf{X}_1, \mathbf{X}_2, \dots, \mathbf{X}_n $ are i.i.d. random variables,

we can find the CDF of $ \mathbf{Y}_{n} $:
$ P(\mathbf{Y}_{n} \leq y) = P(\min \,\{ \mathbf{X}_1, \mathbf{X}_2, \dots, \mathbf{X}_n \} \leq y) $

$ = 1-P(\mathbf{X}_1 > y, \mathbf{X}_2 > y, \dots, \mathbf{X}_n > y) $
$ = 1-P(\mathbf{X}_1 > y)P(\mathbf{X}_2 > y) \dots P(\mathbf{X}_n > y) $
$ = 1-(1-F_{X_1}(y))(1-F_{X_2}(y)) \dots (1-F_{X_n}(y))=1-(1-F_{X_1}(y))^n $
Since $ \mathbf{Y}_{n} $ also takes values in the interval [0,1]:
When y < 0, $ F_{Y_n}(y) = P(\{\mathbf{Y}_{n} \leq y \}) = 0 $ since $ F_{X_1}(y) =0 $

When y > 1, $ F_{Y_n}(y) = P(\{\mathbf{Y}_{n} \leq y \}) = 1 $ since $ F_{X_1}(y) =1 $

When $ 0 \leq y \leq 1 $,
$ F_{X_1}(y)= \int_{0}^{y}f_{X_1}(x)\,dx=\int_{0}^{y}\, 1 \, dx = y $
Thus, the CDF of $ \mathbf{Y}_{n} $ is the following
$ F_{Y_n}\left(y\right)=\begin{cases} \begin{array}{lll} 0, y <0 \\ 1-(1-y)^n, 0 \leq y \leq 1 \\ 1, y>1 \end{array}\end{cases} $
Thus, the pdf of $ \mathbf{Y}_{n} $ is
$ f_{Y_n}\left(y\right)=\begin{cases} \begin{array}{lll} n(1-y)^{n-1}, 0 \leq y \leq 1 \\ 0, \,\,\, \text{elsewhere} \end{array}\end{cases} $
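As an optional check of this CDF/pdf pair, one can differentiate $ F_{Y_n} $ symbolically; below is a minimal sketch assuming SymPy is installed (the fixed value n = 7 used in the integral check is arbitrary):

```python
import sympy as sp

# Differentiate the CDF F_{Y_n}(y) = 1 - (1 - y)^n on [0, 1] to recover the pdf.
y, n = sp.symbols('y n', positive=True)
cdf = 1 - (1 - y) ** n
pdf = sp.simplify(sp.diff(cdf, y))
print(pdf)                                          # n*(1 - y)**(n - 1)
print(sp.integrate(pdf.subs(n, 7), (y, 0, 1)))      # 1 (checked for the fixed value n = 7)
```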
(b)
Let $ \mu = E[Y_n] $ and $ \sigma^2 = E[\,|Y_n - \mu|^2\,] $. By the Chebyshev inequality, we have
$ P(|\mathbf{Y}_{n}-\mu| \geq \varepsilon) \leq \frac{\sigma^2}{\varepsilon^2} = \frac{E[|\mathbf{Y}_{n}-\mu|^2]}{\varepsilon^2} $
Now, we examine
$ E[|\mathbf{Y}_{n}-\mu|^2] = E[Y_n^2-2 \mu Y_n+\mu^2] = E[Y_n^2]-2\mu E[Y_n]+\mu^2=E[Y_n^2]-\mu^2. $
From part (a) we have the pdf of $ Y_n $, so we can find $ E[Y_n] $ and $ E[Y_n^2] $:
$ E[Y_n] = \int_{0}^{1}y f_{Y_n}(y) \,dy = \int_{0}^{1} y n(1-y)^{n-1} \, dy = \frac{1}{n+1} $
$ E[Y_n^2] = \int_{0}^{1}y^2 f_{Y_n}(y) \,dy = \int_{0}^{1} y^2 n(1-y)^{n-1} \, dy = \frac{2}{n^2+3n+2} $
As $ n \to \infty $
$ \lim_{n \to \infty} E[|\mathbf{Y}_{n}-\mu|^2] = \lim_{n \to \infty} \left( E[Y_n^2]-(E[Y_n])^2 \right)= \lim_{n \to \infty}\left( \frac{2}{n^2+3n+2}- (\frac{1}{n+1})^2\right)=0 $
$ \Rightarrow P(|\mathbf{Y}_{n}-\mu| \geq \varepsilon) \leq \frac{E[|\mathbf{Y}_{n}-\mu|^2]}{\varepsilon^2} \to 0 \,\,\text{as} \,\, n \to \infty $
Thus, for any $ \varepsilon>0 $, the sequence $ Y_{n} $ converges in probability to its mean $ \mu = E[Y_n] $; moreover,
$ \lim_{n \to \infty} \mu = \lim_{n \to \infty} E[Y_{n}] = \lim_{n \to \infty} \frac{1}{n+1} =0 $
$ \therefore P(|\mathbf{Y}_{n}-\mu| \geq \varepsilon) = P(|\mathbf{Y}_{n}-0| \geq \varepsilon) \to 0 \,\,\text{as} \,\, n \to \infty $
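As a sanity check on the two moments used above, they can be verified symbolically as well (a sketch assuming SymPy is installed; the fixed value n = 4 is arbitrary):

```python
import sympy as sp

# Verify E[Y_n] = 1/(n+1) and E[Y_n^2] = 2/(n^2 + 3n + 2) for a fixed n (n = 4 is arbitrary).
y = sp.symbols('y', positive=True)
n_val = 4
pdf = n_val * (1 - y) ** (n_val - 1)
m1 = sp.integrate(y * pdf, (y, 0, 1))
m2 = sp.integrate(y**2 * pdf, (y, 0, 1))
print(m1, sp.Rational(1, n_val + 1))                  # 1/5 and 1/5
print(m2, sp.Rational(2, n_val**2 + 3*n_val + 2))     # 1/15 and 1/15
```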
(c)
The CDF of $ Y_n $ is
$ F_{Y_n}\left(y\right)=\begin{cases} \begin{array}{lll} 0, y <0 \\ 1-(1-y)^n, 0 \leq y \leq 1 \\ 1, y>1 \end{array}\end{cases} $
As $ n \to \infty $
$ F_{Y_n}(y) \to F_{Y}(y)= \begin{cases} \begin{array}{lll} 0, \,\, y <0 \\ 1, \,\, y \geq 0 \end{array}\end{cases} $
since, for $ 0 < y < 1 $, we have $ 0 < 1-y < 1 $, so $ (1-y)^n \to 0 $ and hence $ 1-(1-y)^n \to 1 $ as $ n \to \infty $.
Thus, the sequence of random variables $ Y_n $ with cumulative distribution function $ F_{Y_n}(y) $ converges in distribution to the random variable Y with cumulative distribution function $ F_{Y}(y) $, which is a step function.
Comments on Solution 2:
1. Curly braces are required when taking probabilities of events.
2. Error in part (c): as n grows to infinity, 1 - (1-y)^n does not tend to 1 at y = 0; its value there is 0 for every n, so the limiting CDF equals 0 rather than 1 at that point. (Convergence in distribution still holds, because y = 0 is a discontinuity point of F_Y, where pointwise convergence is not required.)
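To make comment 2 concrete, a minimal pure-Python check of $ 1-(1-y)^n $ at y = 0 (the values of n are arbitrary):

```python
# At y = 0, 1 - (1 - y)^n equals 0 for every n, so F_{Y_n}(0) does not approach 1.
for n in (1, 10, 1000):
    print(n, 1 - (1 - 0) ** n)   # always 0; y = 0 is a discontinuity point of the limit F_Y
```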
Related Problems
1. Let $ X_n $ be a sequence of independent, identically distributed random variables, each having the Cauchy pdf
$ f(x)=\frac{1}{\pi (1+x^2)} \text{, } -\infty < x < \infty $
Let $ Y_n $ be a new random variable defined by
$ Y_{n} = \frac{1}{n} \sum_{i=1}^{n} \, X_{i} $
(a) Find the pdf of $ Y_n $, and describe how it depends on n.
(b) Does the sequence $ \mathbf{Y}_{n} $ converge in distribution?
(c) If yes, what is the distribution of the random variable it converges to?
2. Explain what happens in this problem when:
(a) $ Y_n = \max\{X_1, X_2, \dots, X_n\} $.

(b) $ X_1, X_2, \dots, X_n $ are exponentially distributed.
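For the two related problems above, a hedged simulation sketch (assuming NumPy is available; all sample sizes, the seed, and the rate $ \lambda = 2 $ are arbitrary choices) suggests the expected behaviour: the Cauchy sample mean does not settle down as n grows, while the maximum of n uniforms piles up near 1 and the minimum of n exponentials shrinks to 0.

```python
import numpy as np

rng = np.random.default_rng(1)

# Related problem 1: the mean of n standard Cauchy samples is itself standard Cauchy,
# so its spread (interquartile range about 2) does not shrink as n grows.
for n in (10, 100, 1000):
    means = rng.standard_cauchy(size=(5_000, n)).mean(axis=1)
    q25, q75 = np.percentile(means, [25, 75])
    print("cauchy mean IQR", n, q75 - q25)          # stays near 2

# Related problem 2: the max of n uniforms piles up near 1 (E[max] = n/(n+1)),
# while the min of n Exp(rate lam) samples is Exp(rate n*lam), with mean 1/(n*lam).
trials, lam = 10_000, 2.0
for n in (5, 50, 500):
    u = rng.uniform(0.0, 1.0, size=(trials, n))
    e = rng.exponential(scale=1.0 / lam, size=(trials, n))
    print("uniform max / exp min", n, u.max(axis=1).mean(), e.min(axis=1).mean())
```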