
1.4 Discrete Random Variables

From the ECE600 Pre-requisites notes of Sangchun Han, ECE PhD student.


1.4.1 Bernoulli distribution

$ \mathbf{X}=\begin{cases} 1, & \text{if success}\\ 0, & \text{if failure.} \end{cases} $

$ P\left(\left\{ \mathbf{X}=1\right\} \right)=p $.

$ P\left(\left\{ \mathbf{X}=0\right\} \right)=q\left(=1-p\right) $.

$ E\left[\mathbf{X}\right]=1\cdot P\left(\left\{ \mathbf{X}=1\right\} \right)+0\cdot P\left(\left\{ \mathbf{X}=0\right\} \right)=1\cdot p+0\cdot q=p $.

$ E\left[\mathbf{X}^{2}\right]=1^{2}\cdot P\left(\left\{ \mathbf{X}=1\right\} \right)+0^{2}\cdot P\left(\left\{ \mathbf{X}=0\right\} \right)=p $.

$ Var\left[\mathbf{X}\right]=E\left[\mathbf{X}^{2}\right]-\left(E\left[\mathbf{X}\right]\right)^{2}=p-p^{2}=p\left(1-p\right)=pq $.

Moment generating function

$ \phi_{\mathbf{X}}\left(s\right)=E\left[e^{s\mathbf{X}}\right]=e^{s\cdot1}\cdot p+e^{s\cdot0}\cdot q=p\cdot e^{s}+q $.
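
As a quick sanity check of the moments and the moment generating function above, here is a minimal Python sketch; the success probability p = 0.3, the sample size, and the test point s = 0.5 are arbitrary illustrative choices, not values from the notes.

import numpy as np

rng = np.random.default_rng(0)
p = 0.3                                  # illustrative success probability (assumption)
q = 1 - p
x = rng.binomial(1, p, size=100_000)     # Bernoulli(p) samples

print(x.mean(), p)                       # sample mean vs. E[X] = p
print(x.var(), p * q)                    # sample variance vs. Var[X] = pq

s = 0.5
print(np.mean(np.exp(s * x)), p * np.exp(s) + q)   # empirical MGF vs. p*e^s + q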

1.4.2 Binomial distribution

If $ \mathbf{Y}_{1},\mathbf{Y}_{2},\cdots,\mathbf{Y}_{n} $ are i.i.d. Bernoulli random variables, then the Binomial random variable is defined as $ \mathbf{X}=\mathbf{Y}_{1}+\mathbf{Y}_{2}+\cdots+\mathbf{Y}_{n} $, which represents the number of successes in $ n $ Bernoulli trials.

$ p_{\mathbf{X}}\left(k\right)=\left(\begin{array}{c} n\\ k \end{array}\right)p^{k}\left(1-p\right)^{n-k} $.

$ E\left[\mathbf{X}\right]=np $.

$ Var\left[\mathbf{X}\right]=np\left(1-p\right)=npq $.
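
These two moments can be checked directly with scipy; in this sketch n = 10 and p = 0.3 are arbitrary example values, not values from the notes.

from scipy.stats import binom

n, p = 10, 0.3                       # illustrative values (assumption)
mean, var = binom.stats(n, p, moments='mv')
print(mean, n * p)                   # E[X]  = np
print(var,  n * p * (1 - p))         # Var[X] = npq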

Moment generating function

The moment generating function of the Binomial distribution must be $ \phi_{\mathbf{X}}\left(s\right)=\left(p\cdot e^{s}+q\right)^{n} $ because the Binomial distribution is the $ n $-fold convolution of the Bernoulli distribution.

Check

$ \phi_{\mathbf{X}}\left(s\right)=E\left[e^{s\mathbf{X}}\right]=\sum_{k=0}^{n}e^{s\cdot k}\left(\begin{array}{c} n\\ k \end{array}\right)p^{k}\left(1-p\right)^{n-k}=\left(p\cdot e^{s}+\left(1-p\right)\right)^{n}=\left(p\cdot e^{s}+q\right)^{n}. $
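
The same identity can be verified numerically; the sketch below evaluates the sum term by term and compares it with the closed form (n = 5, p = 0.3, s = 0.7 are arbitrary example values).

from math import comb, exp, isclose

n, p, s = 5, 0.3, 0.7                # illustrative values (assumption)
q = 1 - p

lhs = sum(exp(s * k) * comb(n, k) * p**k * q**(n - k) for k in range(n + 1))
rhs = (p * exp(s) + q)**n
print(lhs, rhs, isclose(lhs, rhs))   # the two expressions agree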

1.4.3 Geometric distribution

A Geometric random variable is defined as the number of Bernoulli trials up to and including the first success. The Geometric random variable is memoryless.

$ p_{\mathbf{X}}\left(k\right)=q^{k-1}p $.

Moment generating function

$ \phi_{\mathbf{X}}\left(s\right)=E\left[e^{s\mathbf{X}}\right]=\sum_{k=1}^{\infty}e^{s\cdot k}q^{k-1}p=\frac{p}{q}\sum_{k=1}^{\infty}\left(e^{s}\cdot q\right)^{k}=\frac{p}{q}\cdot\frac{q\cdot e^{s}}{\left(1-q\cdot e^{s}\right)}=\frac{p\cdot e^{s}}{1-q\cdot e^{s}} $.

Probability generating function

$ P_{\mathbf{X}}\left(z\right)=E\left[z^{\mathbf{X}}\right]=\sum_{k=1}^{\infty}z^{k}\cdot q^{k-1}p=\frac{p}{q}\sum_{k=1}^{\infty}\left(z\cdot q\right)^{k}=\frac{p}{q}\cdot\frac{z\cdot q}{1-z\cdot q}=\frac{z\cdot p}{1-z\cdot q} $.
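
Both generating functions can be checked numerically by truncating the infinite sums; the sketch below does this for the probability generating function, with p = 0.3 and z = 0.8 as arbitrary example values chosen so that $ \left|z\cdot q\right|<1 $.

import numpy as np

p, z = 0.3, 0.8                      # illustrative values with |z*q| < 1 (assumption)
q = 1 - p
k = np.arange(1, 2000)               # truncate the infinite sum at a large K

pgf_sum = np.sum(z**k * q**(k - 1) * p)   # sum over k >= 1 of z^k q^(k-1) p
pgf_closed = z * p / (1 - z * q)
print(pgf_sum, pgf_closed)           # truncated sum matches the closed form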

Approach 1 for $ E\left[\mathbf{X}\right] $ and $ Var\left[\mathbf{X}\right] $

$ E\left[\mathbf{X}\right]=\sum_{k=1}^{\infty}k\cdot q^{k-1}p=\frac{p}{q}\sum_{k=1}^{\infty}k\cdot q^{k}=\frac{p}{q}\cdot\frac{q}{\left(1-q\right)^{2}}=\frac{p}{p^{2}}=\frac{1}{p} $.

$ E\left[\mathbf{X}^{2}\right]=\sum_{k=1}^{\infty}k^{2}\cdot q^{k-1}p=\frac{p}{q}\sum_{k=1}^{\infty}k^{2}\cdot q^{k} $, and this sum is awkward to evaluate directly, so the generating-function approaches below are used instead.

Approach 2 for $ E\left[\mathbf{X}\right] $ and $ Var\left[\mathbf{X}\right] $

$ E\left[\mathbf{X}\right]=\frac{1}{p} $.

$ E\left[\mathbf{X}^{2}\right]=\frac{2-p}{p^{2}} $.

$ Var\left[\mathbf{X}\right]=E\left[\mathbf{X}^{2}\right]-\left(E\left[\mathbf{X}\right]\right)^{2}=\frac{2-p}{p^{2}}-\frac{1}{p^{2}}=\frac{1-p}{p^{2}} $.
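
If Approach 2 is read as differentiating the moment generating function ($ E\left[\mathbf{X}\right]=\phi_{\mathbf{X}}'(0) $, $ E\left[\mathbf{X}^{2}\right]=\phi_{\mathbf{X}}''(0) $), the computation can be reproduced symbolically with sympy; this is a sketch of that standard route, not necessarily the exact derivation intended in the original notes.

import sympy as sp

s, p = sp.symbols('s p', positive=True)
q = 1 - p
phi = p * sp.exp(s) / (1 - q * sp.exp(s))        # geometric MGF from above

EX  = sp.simplify(sp.diff(phi, s).subs(s, 0))    # E[X]   = phi'(0)
EX2 = sp.simplify(sp.diff(phi, s, 2).subs(s, 0)) # E[X^2] = phi''(0)
var = sp.simplify(EX2 - EX**2)

print(EX)    # 1/p
print(EX2)   # (2 - p)/p**2
print(var)   # (1 - p)/p**2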

Approach 3 for $ E\left[\mathbf{X}\right] $ and $ Var\left[\mathbf{X}\right] $

$ E\left[\mathbf{X}\right]=\frac{1}{p} $.

$ E\left[\mathbf{X}^{2}\right]=\frac{2-p}{p^{2}} $.

$ Var\left[\mathbf{X}\right]=E\left[\mathbf{X}^{2}\right]-\left(E\left[\mathbf{X}\right]\right)^{2}=\frac{2-p}{p^{2}}-\frac{1}{p^{2}}=\frac{1-p}{p^{2}} $.
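
Similarly, if Approach 3 is read as differentiating the probability generating function ($ E\left[\mathbf{X}\right]=P_{\mathbf{X}}'(1) $ and $ E\left[\mathbf{X}\left(\mathbf{X}-1\right)\right]=P_{\mathbf{X}}''(1) $), the same answers follow; again, this is a sketch of the standard route rather than the notes' own derivation.

import sympy as sp

z, p = sp.symbols('z p', positive=True)
q = 1 - p
P = z * p / (1 - z * q)                          # geometric PGF from above

EX   = sp.simplify(sp.diff(P, z).subs(z, 1))     # E[X] = P'(1)
EXX1 = sp.simplify(sp.diff(P, z, 2).subs(z, 1))  # E[X(X-1)] = P''(1)
EX2  = sp.simplify(EXX1 + EX)                    # E[X^2] = E[X(X-1)] + E[X]
var  = sp.simplify(EX2 - EX**2)

print(EX, EX2, var)   # 1/p, (2 - p)/p**2, (1 - p)/p**2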

1.4.4 Poisson distribution with mean and variance $ \lambda $

$ p_{\mathbf{X}}\left(k\right)=\frac{e^{-\lambda}\lambda^{k}}{k!} $.

Characteristic function

$ \Phi_{\mathbf{X}}(\omega)=E\left[e^{i\omega\mathbf{X}}\right]=\sum_{k=0}^{\infty}\frac{e^{-\lambda}\lambda^{k}}{k!}\cdot e^{i\omega k}=e^{-\lambda}\sum_{k=0}^{\infty}\frac{\left(\lambda\cdot e^{i\omega}\right)^{k}}{k!}=e^{-\lambda}\cdot e^{\lambda e^{i\omega}}=e^{-\lambda\left(1-e^{i\omega}\right)} $.

$ E\left[\mathbf{X}\right]=Var\left[\mathbf{X}\right]=\lambda $.

$ E\left[\mathbf{X}^{2}\right]=Var\left[\mathbf{X}\right]+\left(E\left[\mathbf{X}\right]\right)^{2}=\lambda+\lambda^{2} $.
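
A short numerical check of these Poisson facts; the rate $ \lambda=2.5 $, the sample size, and the test frequency $ \omega=1.3 $ are arbitrary example values.

import numpy as np

rng = np.random.default_rng(0)
lam = 2.5                                    # illustrative rate (assumption)
x = rng.poisson(lam, size=200_000)

print(x.mean(), x.var(), lam)                # E[X] = Var[X] = lambda (approximately)

w = 1.3                                      # arbitrary test frequency
emp = np.mean(np.exp(1j * w * x))            # empirical E[e^{i w X}]
closed = np.exp(-lam * (1 - np.exp(1j * w))) # closed-form characteristic function
print(emp, closed)                           # should agree up to sampling error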

Back to ECE 600 Prerequisites
