= '''2.1 Convergence''' =

Definition. Convergence

A sequence of numbers <math>x_{1},x_{2},\cdots,x_{n},\cdots</math> is said to converge to a limit <math>x</math> if, for every <math>\epsilon>0</math>, there exists a number <math>n_{\epsilon}\in\mathbf{N}</math> such that <math>\left|x_{n}-x\right|<\epsilon,\;\forall n\geq n_{\epsilon}</math>.
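
A minimal worked instance (my own toy example, not from the notes): for <math>x_{n}=1/n</math> and limit <math>x=0</math>, the choice <math>n_{\epsilon}=\lfloor1/\epsilon\rfloor+1</math> works for any <math>\epsilon>0</math>.

<pre>
import math

# Toy check (hypothetical example): x_n = 1/n converges to 0.
# For a given eps, n_eps = floor(1/eps) + 1 guarantees |x_n - 0| < eps
# for every n >= n_eps.
def n_eps(eps):
    return math.floor(1 / eps) + 1

for eps in [0.1, 0.01, 0.001]:
    n0 = n_eps(eps)
    assert all(abs(1.0 / n) < eps for n in range(n0, n0 + 1000))
    print(eps, n0)
</pre>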

"xnx as n→∞".

Given a random sequence <math>\mathbf{X}_{1}\left(\omega\right),\mathbf{X}_{2}\left(\omega\right),\cdots,\mathbf{X}_{n}\left(\omega\right),\cdots</math>, for any particular <math>\omega_{0}\in S</math> the evaluation <math>\mathbf{X}_{1}\left(\omega_{0}\right),\mathbf{X}_{2}\left(\omega_{0}\right),\cdots,\mathbf{X}_{n}\left(\omega_{0}\right),\cdots</math> is a sequence of real numbers.

• It may converge to a number <math>\mathbf{X}\left(\omega_{0}\right)</math>, which is in general a function of <math>\omega_{0}</math>.

• It may not converge.

In general, <math>\left\{ \mathbf{X}_{n}\left(\omega\right)\right\}</math> will converge for some <math>\omega\in S</math> and diverge for other <math>\omega\in S</math>. When we study stochastic convergence, we study the set <math>A\subset S</math> for which <math>\mathbf{X}_{1}\left(\omega\right),\mathbf{X}_{2}\left(\omega\right),\cdots,\mathbf{X}_{n}\left(\omega\right),\cdots</math> is a convergent sequence of real numbers.
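
To make the pointwise picture concrete, here is a minimal sketch (a hypothetical toy sequence, not from the notes): <math>\mathbf{X}_{n}\left(\omega\right)=\left(-1\right)^{n}\omega</math> converges only for the single outcome <math>\omega=0</math>, so here <math>A=\left\{ 0\right\}</math>.

<pre>
# Hypothetical toy sequence: X_n(w) = (-1)**n * w.
# Fixing an outcome w0 turns the random sequence into a number sequence.
def X(n, w):
    return (-1) ** n * w

for w0 in [0.0, 0.5]:
    tail = [X(n, w0) for n in range(100, 106)]
    print(f"w0={w0}: X_100..X_105 = {tail}")
# w0=0.0 -> all zeros (converges); w0=0.5 -> 0.5, -0.5, ... (diverges)
</pre>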

2.1.1 Definition. Convergence everywhere

We say a sequence of random variables converges everywhere (e) if the sequence <math>\mathbf{X}_{1}\left(\omega\right),\mathbf{X}_{2}\left(\omega\right),\cdots,\mathbf{X}_{n}\left(\omega\right),\cdots</math> converges to a number <math>\mathbf{X}\left(\omega\right)</math> for each <math>\omega\in\mathcal{S}</math>.

Note

• The number <math>\mathbf{X}\left(\omega\right)</math> that <math>\left\{ \mathbf{X}_{n}\left(\omega\right)\right\}</math> converges to is in general a function of <math>\omega</math>.

• Convergence (e) is too strong to be useful.

2.1.2 Definition. Convergence almost everywhere

A random sequence <math>\left\{ \mathbf{X}_{n}\left(\omega\right)\right\}</math> converges almost everywhere (a.e.) if the set of outcomes <math>A\subset\mathcal{S}</math> such that <math>\mathbf{X}_{n}\left(\omega\right)\rightarrow\mathbf{X}\left(\omega\right),\;\omega\in A</math> exists and has probability 1: <math>P\left(A\right)=1</math>. Other names for this are almost-sure convergence (a.s.) and convergence with probability one. We write this as “<math>\mathbf{X}_{n}\rightarrow(a.e.)\rightarrow\mathbf{X}</math>” or “<math>P\left(\left\{ \mathbf{X}_{n}\rightarrow\mathbf{X}\right\} \right)=1</math>”.
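
For intuition, a standard illustration (my example, not stated in these notes) is the strong law of large numbers: running means of i.i.d. coin flips converge to <math>1/2</math> for almost every outcome <math>\omega</math>.

<pre>
import random

# Each seed stands in for one outcome w; the running mean M_n(w) of
# i.i.d. Bernoulli(1/2) flips converges to 1/2 for a.e. w (SLLN).
for seed in range(3):
    rng = random.Random(seed)
    total = 0
    for n in range(1, 100001):
        total += rng.random() < 0.5
    print(f"path {seed}: M_100000 = {total / 100000:.4f}")
# Every printed path hovers near 0.5; the exceptional set has probability 0.
</pre>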

2.1.3 Definition. Convergence in mean-square

We say that a random sequence converges in mean-square (m.s.) to a random variable <math>\mathbf{X}</math> if <math>E\left[\left|\mathbf{X}_{n}-\mathbf{X}\right|^{2}\right]\rightarrow0\textrm{ as }n\rightarrow\infty</math>.
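
As a sanity check (a sketch under assumptions of my own choosing, not from the notes), let <math>\mathbf{X}_{n}</math> be the mean of <math>n</math> i.i.d. Uniform(0,1) variables and <math>\mathbf{X}=1/2</math>; then <math>E\left[\left|\mathbf{X}_{n}-\mathbf{X}\right|^{2}\right]=\frac{1}{12n}\rightarrow0</math>.

<pre>
import random

rng = random.Random(0)

def mse(n, trials=20000):
    # Monte Carlo estimate of E[|X_n - 1/2|^2], X_n = mean of n U(0,1).
    acc = 0.0
    for _ in range(trials):
        xn = sum(rng.random() for _ in range(n)) / n
        acc += (xn - 0.5) ** 2
    return acc / trials

for n in [1, 10, 100]:
    print(n, round(mse(n), 5), round(1 / (12 * n), 5))  # estimate vs. exact
</pre>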

Note

Convergence (m.s.) is also called “limit in the mean convergence” and is written “l.i.m. <math>\mathbf{X}_{n}=\mathbf{X}</math>” (bad). Better notation is <math>\mathbf{X}_{n}\rightarrow(m.s.)\rightarrow\mathbf{X}</math>.

2.1.4 Definition. Convergence in probability

A random sequence <math>\left\{ \mathbf{X}_{n}\left(\omega\right)\right\}</math> converges in probability (p) to a random variable <math>\mathbf{X}</math> if, <math>\forall\epsilon>0</math>, <math>P\left(\left\{ \left|\mathbf{X}_{n}-\mathbf{X}\right|>\epsilon\right\} \right)\rightarrow0\textrm{ as }n\rightarrow\infty</math>.

Note that this is a statement about a sequence of probabilities, as opposed to the single probability <math>P\left(\left\{ \mathbf{X}_{n}\rightarrow\mathbf{X}\right\} \right)</math> used for convergence (a.e.); convergence (a.e.) is a much stronger form of convergence.
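
The defining probability can be estimated directly. A minimal sketch (same assumed running-mean example as above): estimate <math>P\left(\left\{ \left|\mathbf{X}_{n}-1/2\right|>\epsilon\right\} \right)</math> for growing <math>n</math>.

<pre>
import random

rng = random.Random(1)
eps = 0.05

def prob_exceed(n, trials=5000):
    # Estimate P(|X_n - 1/2| > eps), X_n = mean of n U(0,1) variables.
    hits = 0
    for _ in range(trials):
        xn = sum(rng.random() for _ in range(n)) / n
        hits += abs(xn - 0.5) > eps
    return hits / trials

for n in [10, 100, 1000]:
    print(n, prob_exceed(n))  # decreases toward 0: X_n ->(p)-> 1/2
</pre>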

2.1.5 Definition. Convergence in distribution

A random sequence <math>\left\{ \mathbf{X}_{n}\left(\omega\right)\right\}</math> converges in distribution (d) to a random variable <math>\mathbf{X}</math> if <math>F_{\mathbf{X}_{n}}\left(x\right)\rightarrow F_{\mathbf{X}}\left(x\right)</math> at every point <math>x\in\mathbf{R}</math> where <math>F_{\mathbf{X}}\left(x\right)</math> is continuous.

Example: Central Limit Theorem
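
A quick look at this example (my sketch; the uniform summands are an assumption, not from the notes): the standardized sum of <math>n</math> i.i.d. Uniform(0,1) variables has a cdf that approaches the standard normal cdf <math>\Phi</math> at every <math>x</math>.

<pre>
import math
import random

rng = random.Random(2)

def std_normal_cdf(x):
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

n, trials, x = 30, 20000, 1.0
mu, sigma = 0.5, math.sqrt(1.0 / 12.0)   # mean and std of U(0,1)
count = 0
for _ in range(trials):
    s = sum(rng.random() for _ in range(n))
    z = (s - n * mu) / (sigma * math.sqrt(n))  # standardized sum
    count += z <= x
print(count / trials, std_normal_cdf(x))  # empirical F_{Z_n}(1) vs Phi(1)
</pre>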

2.1.6 Definition. Convergence in density

A random sequence <math>\left\{ \mathbf{X}_{n}\left(\omega\right)\right\}</math> converges in density (density) to a random variable <math>\mathbf{X}</math> if <math>f_{\mathbf{X}_{n}}\left(x\right)\rightarrow f_{\mathbf{X}}\left(x\right)\textrm{ as }n\rightarrow\infty</math> for every <math>x\in\mathbf{R}</math> where <math>F_{\mathbf{X}}\left(x\right)</math> is continuous.

2.1.7 Convergence in distribution vs. convergence in density

• Aren't convergence in density and distribution equivalent? NO!

• Example: Let <math>\left\{ \mathbf{X}_{n}\left(\omega\right)\right\}</math> be a sequence of random variables with <math>\mathbf{X}_{n}</math> having pdf <math>f_{\mathbf{X}_{n}}\left(x\right)=\left[1+\cos\left(2\pi nx\right)\right]\cdot\mathbf{1}_{\left[0,1\right]}\left(x\right)</math>. <math>f_{\mathbf{X}_{n}}\left(x\right)</math> is a valid pdf for <math>n=1,2,3,\cdots</math>. The cdf of <math>\mathbf{X}_{n}</math> is <math>F_{\mathbf{X}_{n}}\left(x\right)=\left\{ \begin{array}{lll} 0 & , & x<0\\ x+\frac{1}{2\pi n}\sin\left(2\pi nx\right) & , & x\in\left[0,1\right]\\ 1 & , & x>1. \end{array}\right.</math>

• Now define <math>F_{\mathbf{X}}\left(x\right)=\left\{ \begin{array}{lll} 0 & , & x<0\\ x & , & x\in\left[0,1\right]\\ 1 & , & x>1. \end{array}\right.</math>

• Because <math>F_{\mathbf{X}_{n}}\left(x\right)\rightarrow F_{\mathbf{X}}\left(x\right)</math> as <math>n\rightarrow\infty</math>, we have <math>\mathbf{X}_{n}\rightarrow\left(d\right)\rightarrow\mathbf{X}</math>.

• The pdf of <math>\mathbf{X}</math> corresponding to <math>F_{\mathbf{X}}\left(x\right)</math> is <math>f_{\mathbf{X}}\left(x\right)=\mathbf{1}_{\left[0,1\right]}\left(x\right)</math>.

• What does <math>f_{\mathbf{X}_{n}}\left(x\right)</math> look like? For each fixed <math>x\in\left(0,1\right)</math>, the value <math>1+\cos\left(2\pi nx\right)</math> keeps oscillating as <math>n</math> grows, so <math>f_{\mathbf{X}_{n}}\left(x\right)\nrightarrow f_{\mathbf{X}}\left(x\right)</math>: we do not have convergence in density (see the numerical sketch after this list).

• <math>\therefore</math> Convergence in density and convergence in distribution are NOT equivalent. In fact, convergence (density) <math>\Longrightarrow</math> convergence (distribution), but the converse does not hold.
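
A numerical sketch of the example above (it only evaluates the pdf and cdf already defined): at a fixed <math>x</math> the pdf value oscillates with <math>n</math>, while the cdf value converges to <math>x</math>.

<pre>
import math

x = 0.3  # a fixed point in (0, 1)
for n in [1, 2, 5, 10, 50, 51]:
    f = 1 + math.cos(2 * math.pi * n * x)                      # pdf at x
    F = x + math.sin(2 * math.pi * n * x) / (2 * math.pi * n)  # cdf at x
    print(f"n={n:3d}  f_Xn(x)={f:.4f}  F_Xn(x)={F:.4f}")
# f_Xn(x) keeps swinging between 0 and 2, but F_Xn(x) -> 0.3:
# convergence (d) holds while convergence (density) fails.
</pre>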

2.1.8 Cauchy criterion for convergence

Recall that a sequence of numbers <math>x_{1},x_{2},\cdots,x_{n}</math> converges to <math>x</math> if <math>\forall\epsilon>0</math>, <math>\exists n_{\epsilon}\in\mathbf{N}</math> such that <math>\left|x_{n}-x\right|<\epsilon,\;\forall n\geq n_{\epsilon}</math>. To use this definition, you must know <math>x</math>. The Cauchy criterion gives us a way to test for convergence without knowing the limit <math>x</math>.

Cauchy criterion

If <math>\left\{ x_{n}\right\}</math> is a sequence of real numbers and <math>\left|x_{n+m}-x_{n}\right|\rightarrow0</math> as <math>n\rightarrow\infty</math> for all <math>m\in\mathbf{N}</math>, then <math>\left\{ x_{n}\right\}</math> converges to a real number.

Note

The Cauchy criterion can be applied to various forms of stochastic convergence. We look at:

<math>\mathbf{X}_{n}\rightarrow\mathbf{X}</math> (original)

<math>\mathbf{X}_{n}</math> and <math>\mathbf{X}_{n+m}</math> (Cauchy criterion)

Example:

If <math>\varphi\left(n,m\right)=E\left[\left|\mathbf{X}_{n}-\mathbf{X}_{n+m}\right|^{2}\right]\rightarrow0</math> as <math>n\rightarrow\infty</math> for all <math>m=1,2,\cdots</math>, then <math>\left\{ \mathbf{X}_{n}\right\}</math> converges in mean-square.
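
One might test this numerically without ever referencing a limit (a sketch; the running mean of i.i.d. Uniform(0,1) variables is again my assumed example).

<pre>
import random

rng = random.Random(3)

def phi(n, m, trials=5000):
    # Estimate E[|X_n - X_{n+m}|^2], where X_k is the mean of the first k
    # U(0,1) draws of one underlying sequence (same outcome w for both).
    acc = 0.0
    for _ in range(trials):
        u = [rng.random() for _ in range(n + m)]
        xn = sum(u[:n]) / n
        xnm = sum(u) / (n + m)
        acc += (xn - xnm) ** 2
    return acc / trials

for n in [10, 100, 1000]:
    print(n, phi(n, m=50))  # -> 0: m.s. convergence, no limit X needed
</pre>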

2.1.9 Comparison of modes of convergence


convergence <math>\left(m.s.\right)</math> <math>\Longrightarrow</math> convergence <math>\left(p\right)</math>

By the Chebyshev inequality, <math>P\left(\left\{ \left|\mathbf{X}-\mu\right|>\epsilon\right\} \right)\leq\frac{E\left[\left(\mathbf{X}-\mu\right)^{2}\right]}{\epsilon^{2}}=\frac{\sigma_{\mathbf{X}}^{2}}{\epsilon^{2}}</math>

<math>\Longrightarrow P\left(\left\{ \left|\mathbf{X}_{n}-\mathbf{X}\right|>\epsilon\right\} \right)\leq\frac{E\left[\left(\mathbf{X}_{n}-\mathbf{X}\right)^{2}\right]}{\epsilon^{2}}.</math>

Thus, m.s. convergence <math>\Longrightarrow</math> <math>E\left[\left(\mathbf{X}_{n}-\mathbf{X}\right)^{2}\right]\rightarrow0</math> as <math>n\rightarrow\infty</math> <math>\Longrightarrow</math> <math>P\left(\left\{ \left|\mathbf{X}_{n}-\mathbf{X}\right|>\epsilon\right\} \right)\rightarrow0</math> as <math>n\rightarrow\infty</math>.
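
A quick numeric check of this bound (sketch, same assumed running-mean example): the estimated probability should sit below <math>\sigma^{2}/\left(n\epsilon^{2}\right)</math>.

<pre>
import random

rng = random.Random(4)
eps, n, trials = 0.05, 100, 20000
hits, acc = 0, 0.0
for _ in range(trials):
    xn = sum(rng.random() for _ in range(n)) / n   # X_n, mean of n U(0,1)
    hits += abs(xn - 0.5) > eps
    acc += (xn - 0.5) ** 2
print("P(|X_n - X| > eps) estimate:", hits / trials)
print("Chebyshev bound E[...]/eps^2:", acc / trials / eps ** 2)
# Exact bound: (1/12)/(100 * 0.0025) = 1/3; the probability is far smaller.
</pre>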

convergence <math>\left(a.e.\right)</math> <math>\Longrightarrow</math> convergence <math>\left(p\right)</math>

This follows from the definitions; the converse is not true.

convergence <math>\left(d\right)</math> is “weaker than” convergence <math>\left(a.e.\right)</math>, <math>\left(m.s.\right)</math>, or <math>\left(p\right)</math>.

<math>\left(a.e.\right)\Rightarrow\left(d\right)</math>, <math>\left(m.s.\right)\Rightarrow\left(d\right)</math>, and <math>\left(p\right)\Rightarrow\left(d\right)</math>.

Note

<math>\left(a.e.\right)\nRightarrow\left(m.s.\right)</math> and <math>\left(m.s.\right)\nRightarrow\left(a.e.\right)</math>.

Note

The Chebyshev inequality is a valuable tool for working with m.s. convergence.
