Latest revision as of 10:39, 30 November 2010

=2.3 Weak law of large numbers=

Let $ \left\{ \mathbf{X}_{n}\right\} $ be a sequence of $ i.i.d. $ random variables with mean $ \mu $ and variance $ \sigma^{2} $. Define $ \mathbf{Y}_{n}=\frac{1}{n}\sum_{k=1}^{n}\mathbf{X}_{k},\quad n=1,2,\cdots $. Then for any $ \epsilon>0 $, $ p\left(\left\{ \left|\mathbf{Y}_{n}-\mu\right|>\epsilon\right\} \right)\rightarrow0 $ as $ n\rightarrow\infty $ (convergence in probability).

$ \mathbf{Y}_{n}\overset{(p)}{\longrightarrow}\mu\text{ as }n\longrightarrow\infty. $
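The statement can be checked numerically. The sketch below (Python; the Uniform(0, 1) distribution is an arbitrary illustrative choice, so $ \mu=1/2 $) shows the sample mean $ \mathbf{Y}_{n} $ settling toward $ \mu $ as $ n $ grows:

```python
import random

random.seed(0)
mu = 0.5  # mean of Uniform(0, 1)

def sample_mean(n):
    """Y_n = (1/n) * sum of n i.i.d. Uniform(0, 1) draws."""
    return sum(random.random() for _ in range(n)) / n

# |Y_n - mu| tends to shrink as n grows (convergence in probability).
for n in (10, 1000, 100000):
    print(n, abs(sample_mean(n) - mu))
```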

Proof

$ E\left[\mathbf{Y}_{n}\right]=E\left[\frac{1}{n}\sum_{k=1}^{n}\mathbf{X}_{k}\right]=\frac{1}{n}\sum_{k=1}^{n}E\left[\mathbf{X}_{k}\right]=\frac{1}{n}\cdot\left(n\mu\right)=\mu. $

$ Var\left[\mathbf{Y}_{n}\right]=Var\left[\frac{1}{n}\sum_{k=1}^{n}\mathbf{X}_{k}\right]=\frac{1}{n^{2}}\sum_{k=1}^{n}Var\left[\mathbf{X}_{k}\right]=\frac{1}{n^{2}}\cdot n\sigma^{2}=\frac{\sigma^{2}}{n}, $ where the second equality uses the independence of the $ \mathbf{X}_{k} $.

By the Chebyshev inequality,

$ p\left(\left\{ \left|\mathbf{Y}_{n}-\mu\right|\geq\epsilon\right\} \right)\leq\frac{Var\left[\mathbf{Y}_{n}\right]}{\epsilon^{2}}=\frac{\sigma^{2}}{n\epsilon^{2}}\longrightarrow0\text{ as }n\longrightarrow\infty. $

$ \Longrightarrow\mathbf{Y}_{n}\overset{(p)}{\longrightarrow}\mu\text{ as }n\longrightarrow\infty.\blacksquare $
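The Chebyshev bound in the proof can be verified by a quick Monte Carlo experiment (a sketch; the Uniform(0, 1) distribution, and the values of $ n $, $ \epsilon $, and the trial count, are all illustrative choices):

```python
import random

random.seed(1)
mu, sigma2 = 0.5, 1.0 / 12.0     # mean and variance of Uniform(0, 1)
n, eps, trials = 100, 0.05, 2000  # illustrative parameters

def sample_mean(n):
    """Y_n for n i.i.d. Uniform(0, 1) draws."""
    return sum(random.random() for _ in range(n)) / n

# Monte Carlo estimate of p({|Y_n - mu| >= eps}) ...
hits = sum(abs(sample_mean(n) - mu) >= eps for _ in range(trials))
empirical = hits / trials

# ... which Chebyshev bounds by sigma^2 / (n * eps^2).
bound = sigma2 / (n * eps ** 2)
print(empirical, "<=", bound)
```

The empirical probability comes out well under the bound, as expected: Chebyshev is loose but distribution-free.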

Note

One can show that the result remains true as long as the mean exists; the variance need not exist. The proof in that case is harder, and we are not responsible for it in this course.
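For instance, a Pareto distribution with shape $ \alpha=1.5 $ has finite mean $ \alpha/(\alpha-1)=3 $ but infinite variance, and the sample mean still settles near $ 3 $ (a sketch; the distribution and sample sizes are illustrative choices, and convergence is noticeably slower than in the finite-variance case):

```python
import random

random.seed(2)
alpha = 1.5  # Pareto shape: mean = alpha/(alpha-1) = 3, variance infinite

def sample_mean(n):
    """Y_n for n i.i.d. Pareto(alpha) draws (support [1, infinity))."""
    return sum(random.paretovariate(alpha) for _ in range(n)) / n

# Despite the infinite variance, Y_n drifts toward the mean 3.
for n in (100, 10000, 1000000):
    print(n, sample_mean(n))
```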

Note

There are also stronger forms of the law of large numbers. The strong law asserts convergence almost everywhere $ \left(a.e.\right) $, whereas the weak law asserts convergence in probability $ \left(p\right) $.


Back to ECE600

Back to Sequences of Random Variables
