*[[ECE 600 Prerequisites|ECE 600 Prerequisites]]
1.5 Poisson Process
There are Bernoulli trials in every interval of length $ dt $ , and the probability of success in each trial is $ \lambda dt $ . If $ \mathbf{N}\left(t\right) $ is the number of successes in time $ t $ , the set $ \left\{ \mathbf{N}\left(t\right),\; t\geq0\right\} $ is the Poisson process. The distribution of $ \mathbf{N}\left(t\right) $ is obtained from the binomial distribution by inserting $ p=\frac{\lambda t}{n} $ and letting $ n\rightarrow\infty $ : $ P\left(\left\{ \mathbf{N}\left(t\right)=y\right\} \right)=\lim_{n\rightarrow\infty}\left(\begin{array}{c} n\\ y \end{array}\right)\left(\frac{\lambda t}{n}\right)^{y}\left(1-\frac{\lambda t}{n}\right)^{n-y}=\frac{\left(\lambda t\right)^{y}e^{-\lambda t}}{y!},\qquad y=0,1,2,\cdots $.
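One way to see the limit (a sketch of the standard argument): separate the factors and let $ n\rightarrow\infty $ term by term,

$ \left(\begin{array}{c} n\\ y \end{array}\right)\frac{1}{n^{y}}=\frac{n\left(n-1\right)\cdots\left(n-y+1\right)}{y!\, n^{y}}\rightarrow\frac{1}{y!},\qquad\left(1-\frac{\lambda t}{n}\right)^{n}\rightarrow e^{-\lambda t},\qquad\left(1-\frac{\lambda t}{n}\right)^{-y}\rightarrow1 $ ,

so the product converges to $ \frac{\left(\lambda t\right)^{y}e^{-\lambda t}}{y!} $ .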
Moment generating function
$ \phi_{\mathbf{N}\left(t\right)}\left(s\right)=E\left[e^{s\cdot\mathbf{N}\left(t\right)}\right]=\sum_{k=0}^{\infty}e^{s\cdot k}\cdot\frac{\left(\lambda t\right)^{k}e^{-\lambda t}}{k!}=e^{-\lambda t}\sum_{k=0}^{\infty}\frac{\left(e^{s}\cdot\lambda t\right)^{k}}{k!}=e^{-\lambda t}\cdot e^{\lambda t\cdot e^{s}}=e^{-\lambda t\left(1-e^{s}\right)}. $
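As a quick check, differentiating the moment generating function at $ s=0 $ recovers the familiar Poisson mean and variance: $ E\left[\mathbf{N}\left(t\right)\right]=\phi_{\mathbf{N}\left(t\right)}'\left(0\right)=\lambda t $ and $ E\left[\mathbf{N}\left(t\right)^{2}\right]=\phi_{\mathbf{N}\left(t\right)}''\left(0\right)=\lambda t+\left(\lambda t\right)^{2} $ , so $ Var\left[\mathbf{N}\left(t\right)\right]=\lambda t $ .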
Example
Two independent Poisson processes $ \mathbf{N}_{1}\left(t\right) $ and $ \mathbf{N}_{2}\left(t\right) $ have means $ \lambda_{1}t $ and $ \lambda_{2}t $ , respectively. Find the distribution of $ \mathbf{N}_{1}\left(t\right)+\mathbf{N}_{2}\left(t\right) $ .
Solution
$ \phi_{\mathbf{N}_{1}\left(t\right)}\left(s\right)=e^{-\lambda_{1}t\left(1-e^{s}\right)}\text{ and }\phi_{\mathbf{N}_{2}\left(t\right)}\left(s\right)=e^{-\lambda_{2}t\left(1-e^{s}\right)} $.
By independence, $ \phi_{\mathbf{N}_{1}\left(t\right)+\mathbf{N}_{2}\left(t\right)}\left(s\right)=E\left[e^{s\left(\mathbf{N}_{1}\left(t\right)+\mathbf{N}_{2}\left(t\right)\right)}\right]=\phi_{\mathbf{N}_{1}\left(t\right)}\left(s\right)\cdot\phi_{\mathbf{N}_{2}\left(t\right)}\left(s\right)=e^{-\left(\lambda_{1}+\lambda_{2}\right)t\left(1-e^{s}\right)} $ .
Thus, $ \mathbf{N}_{1}\left(t\right)+\mathbf{N}_{2}\left(t\right) $ is a Poisson process with mean $ \left(\lambda_{1}+\lambda_{2}\right)t $ .
Superposition
Let $ \left\{ \mathbf{N}_{i}\left(t\right),\; t\geq0\right\} $ be $ PP\left(\lambda_{i}\right),\; i=1,2,\cdots,n $ , mutually independent. If $ \mathbf{N}\left(t\right)=\sum_{i=1}^{n}\mathbf{N}_{i}\left(t\right) $ and $ \lambda=\sum_{i=1}^{n}\lambda_{i} $ , then $ \left\{ \mathbf{N}\left(t\right),\; t\geq0\right\} $ is $ PP\left(\lambda\right) $ .
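For a fixed $ t $ , the moment generating function argument from the example above extends by induction (a sketch): $ \phi_{\mathbf{N}\left(t\right)}\left(s\right)=\prod_{i=1}^{n}\phi_{\mathbf{N}_{i}\left(t\right)}\left(s\right)=\prod_{i=1}^{n}e^{-\lambda_{i}t\left(1-e^{s}\right)}=e^{-\lambda t\left(1-e^{s}\right)} $ , which is the moment generating function of a Poisson random variable with mean $ \lambda t $ .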
Compound Poisson process
$ \mathbf{Y}_{1},\mathbf{Y}_{2},\cdots $ are i.i.d. random variables. $ \mathbf{N}\left(t\right) $ is a Poisson process $ \left(\lambda\right) $ and is independent of the $ \mathbf{Y}_{i} $ . Then, the set $ \left\{ \mathbf{X}\left(t\right),\; t\geq0\right\} $ is the compound Poisson process, where $ \mathbf{X}\left(t\right)=\mathbf{Y}_{1}+\mathbf{Y}_{2}+\cdots+\mathbf{Y}_{\mathbf{N}\left(t\right)} $ .
Conditioning on $ \mathbf{N}\left(t\right) $ , $ E\left[\mathbf{X}\left(t\right)\right]=E\left[E\left[\mathbf{X}\left(t\right)|\mathbf{N}\left(t\right)\right]\right]=E\left[\mathbf{N}\left(t\right)E\left[\mathbf{Y}\right]\right]=E\left[\mathbf{N}\left(t\right)\right]E\left[\mathbf{Y}\right]=\lambda t\cdot E\left[\mathbf{Y}\right] $ .
Similarly, by the conditional variance formula, $ Var\left[\mathbf{X}\left(t\right)\right]=E\left[\mathbf{N}\left(t\right)\right]Var\left[\mathbf{Y}\right]+Var\left[\mathbf{N}\left(t\right)\right]\left(E\left[\mathbf{Y}\right]\right)^{2}=\lambda t\left(Var\left[\mathbf{Y}\right]+\left(E\left[\mathbf{Y}\right]\right)^{2}\right)=\lambda t\cdot E\left[\mathbf{Y}^{2}\right] $ .
Since a Poisson random variable has equal mean and variance, a necessary condition for $ \mathbf{X}\left(t\right) $ to be a Poisson process is $ \lambda t\cdot E\left[\mathbf{Y}\right]=\lambda t\cdot E\left[\mathbf{Y}^{2}\right] $ , i.e. $ E\left[\mathbf{Y}\right]=E\left[\mathbf{Y}^{2}\right] $ .
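A minimal simulation sketch, assuming Python with NumPy and exponentially distributed jumps $ \mathbf{Y}_{i} $ (an arbitrary illustrative choice; the parameter values are hypothetical), to check the two moment formulas numerically:

# Monte Carlo check of the compound Poisson moment formulas (illustrative sketch).
import numpy as np

rng = np.random.default_rng(0)
lam, t, mean_Y = 2.0, 3.0, 1.5           # rate of N(t), time horizon, E[Y] (arbitrary choices)
n_trials = 200_000

# For each trial: draw N(t) ~ Poisson(lam * t), then sum N(t) i.i.d. exponential jumps Y_i.
N = rng.poisson(lam * t, size=n_trials)
X = np.array([rng.exponential(mean_Y, size=n).sum() for n in N])

# For Exponential(mean_Y): E[Y] = mean_Y and E[Y^2] = 2 * mean_Y**2.
EY, EY2 = mean_Y, 2 * mean_Y ** 2
print("E[X(t)]   simulated:", X.mean(), " theory:", lam * t * EY)
print("Var[X(t)] simulated:", X.var(),  " theory:", lam * t * EY2)

The simulated moments should match $ \lambda t\cdot E\left[\mathbf{Y}\right]=9 $ and $ \lambda t\cdot E\left[\mathbf{Y}^{2}\right]=27 $ up to Monte Carlo error for these parameter choices.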
Now suppose the $ \mathbf{Y}_{i} $ are themselves Poisson with mean $ \lambda_{Y}t $ (e.g., each $ \mathbf{Y}_{i} $ is the value at time $ t $ of an independent $ PP\left(\lambda_{Y}\right) $ ).
The probability generating function for $ \mathbf{N}\left(t\right) $ is $ P_{N\left(t\right)}\left(z\right)=E\left[z^{N\left(t\right)}\right]=\sum_{k=0}^{\infty}z^{k}\cdot\frac{e^{-\lambda t}\cdot\left(\lambda t\right)^{k}}{k!}=e^{-\lambda t}\cdot\sum_{k=0}^{\infty}\frac{\left(z\lambda t\right)^{k}}{k!}=e^{-\lambda t}\cdot e^{z\lambda t}=e^{-\lambda t\left(1-z\right)} $.
The moment generating function of each $ \mathbf{Y}_{i} $ is $ \phi_{Y}\left(s\right)=E\left[e^{sY}\right]=\sum_{k=0}^{\infty}e^{sk}\cdot\frac{e^{-\lambda_{Y}t}\cdot\left(\lambda_{Y}t\right)^{k}}{k!}=e^{-\lambda_{Y}t\left(1-e^{s}\right)} $ .
Now, since $ E\left[e^{s\mathbf{X}\left(t\right)}\right]=E\left[E\left[e^{s\mathbf{X}\left(t\right)}|\mathbf{N}\left(t\right)\right]\right]=E\left[\phi_{Y}\left(s\right)^{\mathbf{N}\left(t\right)}\right]=P_{N\left(t\right)}\left(\phi_{Y}\left(s\right)\right) $ , the moment generating function for $ X\left(t\right) $ is
$ \phi_{X}\left(s\right)=P_{N\left(t\right)}\left(z\right)\biggl|_{z=\phi_{Y}\left(s\right)}=e^{-\lambda t\left(1-z\right)}\biggl|_{z=e^{-\lambda_{Y}t\left(1-e^{s}\right)}}=e^{-\lambda t\left(1-e^{-\lambda_{Y}t\left(1-e^{s}\right)}\right)}. $
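As a consistency check (a quick sketch), differentiate at $ s=0 $ : since $ \phi_{Y}\left(0\right)=1 $ and $ \phi_{Y}'\left(0\right)=\lambda_{Y}t $ , the chain rule gives $ E\left[\mathbf{X}\left(t\right)\right]=\phi_{X}'\left(0\right)=\lambda t\cdot\phi_{Y}'\left(0\right)\cdot e^{-\lambda t\left(1-\phi_{Y}\left(0\right)\right)}=\lambda t\cdot\lambda_{Y}t $ , which agrees with $ E\left[\mathbf{X}\left(t\right)\right]=\lambda t\cdot E\left[\mathbf{Y}\right] $ because $ E\left[\mathbf{Y}\right]=\lambda_{Y}t $ here.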