Revision as of 05:28, 17 November 2010
1.6 Continuous Random Variables
1.6.1 Gaussian distribution (normal distribution) $ \mathcal{N}\left(\mu,\sigma^{2}\right) $
$ f_{\mathbf{X}}(x)=\frac{1}{\sqrt{2\pi}\sigma}\cdot e^{-\frac{(x-\mu)^{2}}{2\sigma^{2}}} $
$ \Phi_{\mathbf{X}}\left(\omega\right)=e^{i\mu\omega}e^{-\frac{1}{2}\sigma^{2}\omega^{2}} $
$ \phi_{\mathbf{X}}\left(s\right)=e^{\mu s}e^{\frac{1}{2}\sigma^{2}s^{2}} $
$ E\left[\mathbf{X}\right]=\mu $
$ Var\left[\mathbf{X}\right]=\sigma^{2} $
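These moments can be checked empirically. A minimal sketch using only the Python standard library (the sample size, seed, and parameter values are arbitrary illustrative choices):

```python
# Draw samples from N(mu, sigma^2) and compare the sample mean and
# variance with the theoretical values E[X] = mu, Var[X] = sigma^2.
import random
import statistics

mu, sigma = 2.0, 3.0
random.seed(0)
samples = [random.gauss(mu, sigma) for _ in range(200_000)]

mean_est = statistics.fmean(samples)     # should be close to mu = 2.0
var_est = statistics.pvariance(samples)  # should be close to sigma^2 = 9.0
print(mean_est, var_est)
```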
1.6.2 Log-normal distribution $ \ln\mathcal{N}\left(\mu,\sigma^{2}\right) $
The logarithm of a log-normal random variable is normally distributed: if $ \mathbf{X} $ is a random variable with log-normal distribution, then $ \mathbf{Y}=\ln\mathbf{X} $ is a random variable with Gaussian distribution. This distribution is characterized by two parameters, $ \mu $ and $ \sigma $ (which are the mean and standard deviation of $ \mathbf{Y} $ rather than of $ \mathbf{X} $).
$ f_{\mathbf{X}}\left(x\right)=\frac{1}{x\sqrt{2\pi}\sigma}\cdot e^{-\frac{(\ln x-\mu)^{2}}{2\sigma^{2}}} $.
$ E\left[\mathbf{X}\right]=e^{\mu+\sigma^{2}/2} $.
$ Var\left[\mathbf{X}\right]=\left(e^{\sigma^{2}}-1\right)e^{2\mu+\sigma^{2}} $.
MLE of log-normal distribution
$ \hat{\mu}=\frac{\sum_{k}\ln x_{k}}{n} $.
$ \hat{\sigma}^{2}=\frac{\sum_{k}\left(\ln x_{k}-\hat{\mu}\right)^{2}}{n} $.
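The MLE formulas above reduce to the sample mean and (biased) sample variance of the log-data, since $ \ln\mathbf{X}\sim\mathcal{N}\left(\mu,\sigma^{2}\right) $. A minimal sketch on simulated data (parameter values and seed are arbitrary):

```python
# Simulate log-normal data by exponentiating Gaussian draws, then
# recover mu and sigma^2 with the MLE formulas from the text.
import math
import random

mu, sigma = 0.5, 0.8
random.seed(1)
data = [math.exp(random.gauss(mu, sigma)) for _ in range(100_000)]

logs = [math.log(x) for x in data]
n = len(logs)
mu_hat = sum(logs) / n                                  # ~ 0.5
sigma2_hat = sum((v - mu_hat) ** 2 for v in logs) / n   # ~ 0.64
print(mu_hat, sigma2_hat)
```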
1.6.3 Exponential distribution with mean $ \mu $
It models the waiting time until the first arrival in a Poisson process. It is memoryless: $ P\left(\mathbf{X}>s+t\mid\mathbf{X}>s\right)=P\left(\mathbf{X}>t\right) $.
$ F_{\mathbf{X}}\left(x\right)=\left(1-e^{-\frac{x}{\mu}}\right)\cdot\mathbf{1}_{\left[0,\infty\right)}\left(x\right) $.
$ f_{\mathbf{X}}\left(x\right)=\frac{1}{\mu}e^{-\frac{x}{\mu}}\cdot\mathbf{1}_{\left[0,\infty\right)}\left(x\right) $.
$ \Phi_{\mathbf{X}}\left(\omega\right)=\int_{0}^{\infty}\frac{1}{\mu}e^{-\frac{x}{\mu}}\cdot e^{i\omega x}dx=\frac{1}{\mu}\int_{0}^{\infty}e^{x\left(i\omega-\frac{1}{\mu}\right)}dx=\frac{1}{\mu}\cdot\frac{e^{x\left(i\omega-\frac{1}{\mu}\right)}}{i\omega-\frac{1}{\mu}}\Biggl|_{0}^{\infty}=\frac{\frac{1}{\mu}}{\frac{1}{\mu}-i\omega}=\frac{1}{1-i\mu\omega} $.
$ E\left[\mathbf{X}\right]=\mu $.
$ Var\left[\mathbf{X}\right]=\mu^{2} $.
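The memoryless property can be verified directly from the CDF above, since $ P\left(\mathbf{X}>x\right)=e^{-x/\mu} $. A minimal sketch (the values of $ \mu $, $ s $, and $ t $ are arbitrary):

```python
# Check P(X > s + t | X > s) = P(X > t) using the exponential tail
# probability P(X > x) = exp(-x / mu).
import math

mu = 2.0
s, t = 1.5, 0.7

def tail(x):
    """P(X > x) for an exponential r.v. with mean mu."""
    return math.exp(-x / mu)

lhs = tail(s + t) / tail(s)  # conditional tail probability
rhs = tail(t)
print(lhs, rhs)  # equal up to floating-point rounding
```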
Consider a Poisson process with rate $ \lambda $. The time until the first arrival is exponentially distributed: the probability that the first arrival occurs in $ \left[t,t+dt\right) $ is $ f_{\mathbf{X}}\left(t\right)dt=\lambda e^{-\lambda t}dt $. In fact, $ \lambda=1/\mu $.
Moment generating function
$ \phi_{\mathbf{X}}\left(s\right)=E\left[e^{s\mathbf{X}}\right]=\int_{0}^{\infty}e^{sx}\cdot\lambda e^{-\lambda x}dx=\lambda\int_{0}^{\infty}e^{x\left(s-\lambda\right)}dx=\frac{\lambda}{s-\lambda}e^{x\left(s-\lambda\right)}\biggl|_{0}^{\infty}=\frac{\lambda}{\lambda-s}=\frac{1}{1-s\cdot\mu} $, valid for $ s<\lambda $.
1.6.4 Erlang distribution
If $ \mathbf{X}_{1},\mathbf{X}_{2},\cdots $ are i.i.d. exponential random variables, then the Erlang random variable is defined as $ \mathbf{Z}_{m}=\mathbf{X}_{1}+\mathbf{X}_{2}+\cdots+\mathbf{X}_{m} $. It models the time until the $ m $-th arrival. $ f_{\mathbf{Z}_{m}}\left(t\right)=\frac{\left(\lambda t\right)^{m-1}e^{-\lambda t}}{\left(m-1\right)!}\cdot\lambda,\quad t\geq0 $.
$ E\left[\mathbf{Z}_{m}\right]=E\left[\mathbf{X}_{1}+\mathbf{X}_{2}+\cdots+\mathbf{X}_{m}\right]=\sum_{k=1}^{m}E\left[\mathbf{X}_{k}\right]=m\cdot\frac{1}{\lambda}=\frac{m}{\lambda} $.
$ Var\left[\mathbf{Z}_{m}\right]=Var\left[\mathbf{X}_{1}+\mathbf{X}_{2}+\cdots+\mathbf{X}_{m}\right]=\sum_{k=1}^{m}Var\left[\mathbf{X}_{k}\right]=m\cdot\frac{1}{\lambda^{2}}=\frac{m}{\lambda^{2}} $. There is one important relationship that is useful for solving several problems: if $ \mathbf{N}\left(t\right) $ is the number of arrivals up to time $ t $, then $ P\left(\mathbf{Z}_{m}\leq t\right)=P\left(\mathbf{N}\left(t\right)\geq m\right) $.
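The relationship $ P\left(\mathbf{Z}_{m}\leq t\right)=P\left(\mathbf{N}\left(t\right)\geq m\right) $ can be checked numerically: integrate the Erlang density for the left side, and use the Poisson probabilities for the right side. A minimal sketch (the values of $ \lambda $, $ m $, and $ t $ are arbitrary):

```python
# Verify P(Z_m <= t) = P(N(t) >= m) for lambda = 1.5, m = 3, t = 2.
import math

lam, m, t = 1.5, 3, 2.0

# Left side: Erlang CDF by midpoint-rule integration of the density
# f(u) = (lam*u)^(m-1) * exp(-lam*u) / (m-1)! * lam.
steps = 100_000
dt = t / steps
coef = lam / math.factorial(m - 1)
cdf = 0.0
for i in range(steps):
    u = (i + 0.5) * dt
    cdf += coef * (lam * u) ** (m - 1) * math.exp(-lam * u) * dt

# Right side: Poisson tail P(N(t) >= m) = 1 - sum_{k<m} P(N(t) = k).
tail = 1.0 - sum(math.exp(-lam * t) * (lam * t) ** k / math.factorial(k)
                 for k in range(m))
print(cdf, tail)  # both about 0.5768
```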
Moment generating function
$ \phi_{\mathbf{Z}_{m}}\left(s\right)=E\left[e^{s\mathbf{Z}_{m}}\right]=E\left[e^{s\left(\mathbf{X}_{1}+\mathbf{X}_{2}+\cdots+\mathbf{X}_{m}\right)}\right]=\left(\phi_{\mathbf{X}}\left(s\right)\right)^{m}=\left(\frac{\lambda}{\lambda-s}\right)^{m} $.
Gamma distribution
In the Erlang distribution, $ m $ is a natural number. If $ m $ is allowed to be any positive real number, the distribution becomes the Gamma distribution, and $ \left(m-1\right)! $ is replaced by the Gamma function $ \Gamma\left(m\right)=\int_{0}^{\infty}e^{-x}x^{m-1}dx $.
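For natural $ m $ the Gamma function reduces to the factorial, $ \Gamma\left(m\right)=\left(m-1\right)! $, so the Gamma density coincides with the Erlang density at integer shape parameters. A quick check with the standard library:

```python
# Confirm Gamma(m) = (m-1)! for small natural m.
import math

for m in range(1, 7):
    print(m, math.gamma(m), math.factorial(m - 1))
```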