

ECE Ph.D. Qualifying Exam

Communication, Networking, Signal and Image Processing (CS)

Question 1: Probability and Random Processes

August 2013



Part 4

Consider a sequence of independent random variables $ X_1,X_2,... $, where $ X_n $ has pdf

$ \begin{align}f_n(x)=&\left(1-\frac{1}{n}\right)\frac{1}{\sqrt{2\pi}\sigma}\exp\left[-\frac{1}{2\sigma^2}\left(x-\frac{n-1}{n}\sigma\right)^2\right]\\ &+\frac{1}{n}\sigma \exp(-\sigma x)u(x)\end{align} $.

Does this sequence converge in the mean-square sense? Hint: Use the Cauchy criterion for mean-square convergence, which states that a sequence of random variables $ X_1,X_2,... $ converges in mean-square if and only if $ E[|X_n-X_{n+m}|] \to 0 $ as $ n \to \infty $, for every $ m>0 $.
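
Note that $ f_n $ is a two-component mixture: a $ N(\frac{n-1}{n}\sigma,\sigma^2) $ draw with probability $ 1-\frac{1}{n} $, and an $ EXP(\sigma) $ draw (rate $ \sigma $) with probability $ \frac{1}{n} $. A minimal sampler sketch, assuming NumPy; the function name sample_Xn and the values of n and sigma are illustrative:

 import numpy as np
 
 def sample_Xn(n, sigma, size, rng):
     # Draw from f_n: a N((n-1)/n*sigma, sigma^2) component with
     # probability 1 - 1/n, an Exp(sigma) component with probability 1/n.
     from_normal = rng.random(size) < 1.0 - 1.0 / n
     normal = rng.normal((n - 1) / n * sigma, sigma, size)
     # NumPy parameterizes the exponential by the scale 1/sigma, not the rate.
     expo = rng.exponential(1.0 / sigma, size)
     return np.where(from_normal, normal, expo)
 
 rng = np.random.default_rng(0)
 x = sample_Xn(n=100, sigma=2.0, size=500_000, rng=rng)
 print(x.mean())  # about 1.97; the exact mean is ((99/100)**2)*2 + 1/200 = 1.9652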


Solution 1

$ E(X_n)=\frac{n-1}{n}E(Y)+\frac{1}{n}E(Z) $

where

$ Y \sim N(\frac{n-1}{n}\sigma, \sigma^2) $

$ Z \sim EXP(\sigma) $

From the properties of the normal and exponential distributions,

$ E(Y)=\frac{n-1}{n}\sigma $

$ E(Z)=\frac{1}{\sigma} $.
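
Combining these with the mixture weights $ \frac{n-1}{n} $ and $ \frac{1}{n} $,

$ E(X_n)=\frac{n-1}{n}\cdot\frac{n-1}{n}\sigma+\frac{1}{n}\cdot\frac{1}{\sigma}=\left(\frac{n-1}{n}\right)^2\sigma+\frac{1}{n\sigma} $.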

Therefore,

$ \lim_{n\to \infty}E(X_n)=\lim_{n\to \infty}\left[\left(\frac{n-1}{n}\right)^2\sigma+\frac{1}{n\sigma}\right]=\sigma $.

Also,

$ \lim_{n\to \infty}E(X_{n+m})=\lim_{n\to \infty}\left[\left(\frac{n+m-1}{n+m}\right)^2\sigma+\frac{1}{(n+m)\sigma}\right]=\sigma $.

Thus,

$ \lim_{n\to \infty}E(X_n-X_{n+m})=\lim_{n\to \infty}E(X_n)-\lim_{n\to \infty}E(X_{n+m})=0 $,

$ \lim_{n\to \infty}E(X_{n+m}-X_n)=\lim_{n\to \infty}E(X_{n+m})-\lim_{n\to \infty}E(X_n)=0 $.

So we have

$ \lim_{n\to \infty}E(|X_{n+m}-X_n|)=0 $

for every $ m>0 $.

By the Cauchy criterion for mean-square convergence stated in the hint, the sequence converges in the mean-square sense.
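
As a quick numerical sanity check of the two mean limits above (only the means, not the full mean-square claim), the closed form $ E(X_n)=\left(\frac{n-1}{n}\right)^2\sigma+\frac{1}{n\sigma} $ does approach $ \sigma $; a minimal sketch in Python with an illustrative $ \sigma=2 $:

 # E(X_n) = ((n-1)/n)^2 * sigma + 1/(n*sigma), from the mixture means above.
 sigma = 2.0
 for n in (10, 100, 1000, 10000):
     print(n, ((n - 1) / n) ** 2 * sigma + 1.0 / (n * sigma))
 # The printed values increase toward sigma = 2.0.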


Solution 2

$ \begin{align} E(X)&=\int_{-\infty}^{+\infty}xp(x)dx\\ &=\int_{0}^{\infty}x\lambda e^{-\lambda x}dx\\ &=-\left(xe^{-\lambda x}\Big|_0^{\infty}-\int_0^{\infty}e^{-\lambda x}dx\right)\\ &=\frac{1}{\lambda} \end{align} $

$ \begin{align} E(X^2)&=\int_{-\infty}^{+\infty}x^2p(x)dx\\ &=\int_{0}^{\infty}x^2 \lambda e^{-\lambda x}dx\\ &=-\left(x^2e^{-\lambda x}\Big|_0^{\infty}-\int_0^{\infty}2xe^{-\lambda x}dx\right)\\ &=\frac{2}{\lambda}E(X)=\frac{2}{\lambda^2} \end{align} $

Therefore,

$ Var(X)=E(X^2)-E(X)^2=\frac{2}{\lambda^2}-\frac{1}{\lambda^2}=\frac{1}{\lambda^2} $
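
Both moments can also be verified symbolically; a minimal sketch, assuming SymPy:

 import sympy as sp
 
 x = sp.symbols('x', positive=True)
 lam = sp.symbols('lambda', positive=True)
 pdf = lam * sp.exp(-lam * x)  # Exp(lambda) pdf on [0, oo)
 
 EX = sp.integrate(x * pdf, (x, 0, sp.oo))      # -> 1/lambda
 EX2 = sp.integrate(x**2 * pdf, (x, 0, sp.oo))  # -> 2/lambda**2
 print(EX, EX2, sp.simplify(EX2 - EX**2))       # variance -> lambda**(-2)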

Critique on Solution 2:

Solution 2 is correct. In addition, calculating $ E(X) $ first is better since the result can be used in calculating $ E(X^2) $.
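
Concretely, a single integration by parts reduces the second moment to a multiple of the first:

$ E(X^2)=\int_0^{\infty}x^2\lambda e^{-\lambda x}dx=\frac{2}{\lambda}\int_0^{\infty}x\lambda e^{-\lambda x}dx=\frac{2}{\lambda}E(X)=\frac{2}{\lambda^2} $.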


Back to QE CS question 1, August 2013

Back to ECE Qualifying Exams (QE) page
