Revision as of 12:30, 26 January 2014


ECE Ph.D. Qualifying Exam

Communication, Networking, Signal and Image Processing (CS)

Question 1: Probability and Random Processes

August 2012





Problem 2

Problem statement: Let X be a continuous or discrete random variable with mean μ and variance σ². Then, $ \forall \varepsilon >0 $, we have
$ P(|X-\mu| \geq \varepsilon) \leq \frac{\sigma^2}{\varepsilon^2} $

$ \color{blue}\text{Solution 1:} $
  • Continuous case:

Let X be a random variable with mean $ \mu $ and variance $ \sigma^2 $.

Consider two functions $ g_{1}(x) $ and $ g_{2}(x) $:

$ g_{1}(x)= 1_{\{x: \mid x-\mu\mid \geq \varepsilon\}}(x) $

$ g_{2}(x)=\frac{(x-\mu)^2}{\varepsilon^2} $

Clearly, for all x,

$ g_{2}(x)-g_{1}(x) \geq 0, $

since $ g_{2}(x) \geq 1 = g_{1}(x) $ when $ |x-\mu| \geq \varepsilon $, and $ g_{1}(x) = 0 \leq g_{2}(x) $ otherwise. Taking expectations preserves the inequality:

$ E[g_{2}(X)-g_{1}(X)] \geq 0 $
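The pointwise inequality can be sanity-checked numerically on a grid (a quick sketch; the values of μ and ε are arbitrary choices):

```python
# Check g2(x) - g1(x) >= 0 on a grid of x values (mu, eps chosen arbitrarily).
mu, eps = 0.0, 1.0

def g1(x):
    # Indicator of the event |x - mu| >= eps
    return 1.0 if abs(x - mu) >= eps else 0.0

def g2(x):
    # Quadratic majorant of the indicator
    return (x - mu) ** 2 / eps ** 2

xs = [i / 100 for i in range(-500, 501)]  # grid over [-5, 5]
assert all(g2(x) - g1(x) >= 0 for x in xs)
```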

By linearity of expectation,

$ E[g_{2}(X)]-E[g_{1}(X)] = E[g_{2}(X)-g_{1}(X)] \geq 0 $

where, $ E[g_{2}(X)]=E\left[\frac{(X-\mu)^2}{\varepsilon^2}\right] = \frac{1}{\varepsilon^2} \operatorname{var}(X) = \frac{\sigma^2}{\varepsilon^2} $

and

$ E[g_{1}(X)] = P \{\mid X - \mu\mid \geq \varepsilon \} $

Thus we get,

$ \frac{\sigma^2}{\varepsilon^2} - P\{\mid X - \mu \mid \geq \varepsilon\} \geq 0 $

Therefore,

$ P\{\mid X - \mu \mid \geq \varepsilon\} \leq \frac{\sigma^2}{\varepsilon^2} $
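The bound just derived can be verified by simulation (a minimal sketch; the exponential distribution and the values of n and ε are arbitrary illustrative choices):

```python
import random

random.seed(0)

# X ~ Exponential(rate=1), so mu = 1 and sigma^2 = 1.
mu, var = 1.0, 1.0
eps = 2.0
n = 100_000

samples = [random.expovariate(1.0) for _ in range(n)]
empirical = sum(abs(x - mu) >= eps for x in samples) / n  # estimates P(|X - mu| >= eps)
bound = var / eps ** 2                                    # Chebyshev bound = 0.25

# The true probability is P(X >= 3) = e^{-3} ~ 0.05, well below the bound.
assert empirical <= bound
```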


$ \color{blue}\text{Solution 2:} $
  • Discrete Case:

Let $ p_{X}(x) $ be the pmf of X. The probability that X differs from μ by at least $ \varepsilon $ is
$ P(|X-\mu| \geq \varepsilon)= \sum_{x:\,|x-\mu| \geq \varepsilon}p_{X}(x) $
Based on the definition of the variance, we have

$ \sigma^2 = \sum_{x}(x-\mu)^2 p_{X}(x) $


Let $ A= \{ x \mid\,|x-\mu| \geq \varepsilon \} $. We have
$ \sigma^2 = \sum_{x}(x-\mu)^2 p_{X}(x)= \sum_{x \in A}(x-\mu)^2 p_{X}(x)+\sum_{x \notin A}(x-\mu)^2 p_{X}(x) $
$ \Rightarrow\sigma^2 \geq \sum_{x \in A}(x-\mu)^2 p_{X}(x) $
Since $ |x-\mu| \geq \varepsilon $ for every x in A, we have
$ \Rightarrow\sigma^2 \geq \sum_{x \in A}\varepsilon^2 p_{X}(x)= \varepsilon^2 \sum_{x \in A}p_{X}(x)=\varepsilon^2 P(X \in A) =\varepsilon^2 P(|X-\mu| \geq \varepsilon) $
That is
$ P(|X-\mu| \geq \varepsilon) \leq \frac{\sigma^2}{\varepsilon^2} $
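A concrete discrete example, using exact rational arithmetic on a fair six-sided die (the pmf and the value of ε are illustrative choices):

```python
from fractions import Fraction

# Fair die: p_X(x) = 1/6 for x in {1, ..., 6}
pmf = {x: Fraction(1, 6) for x in range(1, 7)}

mu = sum(x * p for x, p in pmf.items())               # 7/2
var = sum((x - mu) ** 2 * p for x, p in pmf.items())  # 35/12
eps = 2

# Left-hand side: P(|X - mu| >= eps), i.e. X in {1, 6}
lhs = sum(p for x, p in pmf.items() if abs(x - mu) >= eps)
# Right-hand side: sigma^2 / eps^2
rhs = var / eps ** 2

assert lhs == Fraction(1, 3)
assert rhs == Fraction(35, 48)
assert lhs <= rhs
```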

  • Continuous Case:

Let $ f_{X}(x) $ be the pdf of X.
$ \sigma^2=\int_{-\infty}^{\infty}(x-\mu)^2f_{X}(x) \,dx \geq \int_{-\infty}^{\mu-\varepsilon}(x-\mu)^2f_{X}(x) \,dx+ \int_{\mu+\varepsilon}^{\infty}(x-\mu)^2f_{X}(x) \,dx $
The last inequality holds since we dropped the integral of a nonnegative function over $ (\mu-\varepsilon, \mu+\varepsilon) $. For $ x \leq \mu-\varepsilon $ or $ x \geq \mu+\varepsilon $,
$ \Rightarrow |x-\mu| \geq \varepsilon \Rightarrow (x-\mu)^2 \geq \varepsilon^2 $
Based on the above equation, we have
$ \sigma^2 \geq \int_{-\infty}^{\mu-\varepsilon}\varepsilon^2 f_{X}(x) \,dx+ \int_{\mu+\varepsilon}^{\infty} \varepsilon^2 f_{X}(x) \,dx $
$ = \varepsilon^2 \left( \int_{-\infty}^{\mu-\varepsilon}f_{X}(x) \,dx+ \int_{\mu+\varepsilon}^{\infty} f_{X}(x) \,dx \right) = \varepsilon^2 P \bigg( X \leq (\mu-\varepsilon)\, \text{or} \, X \geq (\mu+\varepsilon) \bigg) = \varepsilon^2 P(|X-\mu| \geq \varepsilon) $
$ \Rightarrow P(|X-\mu| \geq \varepsilon) \leq \frac{\sigma^2}{\varepsilon^2} $
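For a closed-form continuous check, one can take X ~ Uniform(0, 1), where μ = 1/2, σ² = 1/12, and the tail probability P(|X − μ| ≥ ε) equals 1 − 2ε for 0 < ε ≤ 1/2 (the distribution and ε values are illustrative choices):

```python
# X ~ Uniform(0, 1): mu = 1/2, var = 1/12.
# For 0 < eps <= 1/2, P(|X - 1/2| >= eps) = 1 - 2*eps; for eps > 1/2 it is 0.
var = 1 / 12

for eps in [0.1, 0.25, 0.4, 0.5, 0.75]:
    lhs = max(0.0, 1 - 2 * eps)  # exact tail probability
    rhs = var / eps ** 2         # Chebyshev bound
    assert lhs <= rhs
```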
