
Exponential Distribution

Let $ X_1, X_2, \ldots, X_n $ be a random sample from the exponential distribution with p.d.f.

$ f(x;\theta)=\frac{1}{\theta}e^{-x/\theta}, \quad 0<x<\infty, \quad \theta\in\Omega=\{\theta \mid 0<\theta<\infty\} $

The likelihood function is given by:

$ L(\theta)=L(\theta;x_1,x_2,\ldots,x_n)=\left(\frac{1}{\theta}e^{-x_1/\theta}\right)\left(\frac{1}{\theta}e^{-x_2/\theta}\right)\cdots\left(\frac{1}{\theta}e^{-x_n/\theta}\right)=\frac{1}{\theta^n}\exp\left(-\frac{\sum_{i=1}^{n}x_i}{\theta}\right) $

Since the logarithm is monotone increasing, maximizing $ \ln L(\theta) $ is equivalent to maximizing $ L(\theta) $. Taking the logarithm, we get

$ \ln L(\theta)=-n\ln\theta-\frac{1}{\theta}\sum_{i=1}^{n}x_i, \quad 0<\theta<\infty $

Differentiating the above expression with respect to $ \theta $ and equating to zero, we get

$ \frac{d\left[\ln L(\theta)\right]}{d\theta}=-\frac{n}{\theta}+\frac{1}{\theta^2}\sum_{i=1}^{n}x_i=0 $

Solving this equation for $ \theta $ gives:

$ \theta=\frac{\sum_{i=1}^{n}x_i}{n}=\bar{x} $
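
A second-derivative check (added here for completeness) confirms that this critical point is a maximum:

$ \frac{d^2\left[\ln L(\theta)\right]}{d\theta^2}=\frac{n}{\theta^2}-\frac{2}{\theta^3}\sum_{i=1}^{n}x_i $

Substituting $ \sum_{i=1}^{n}x_i=n\theta $ at the critical point reduces this to $ -\frac{n}{\theta^2}<0 $.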

Thus, the maximum likelihood estimator of $ \theta $ is

$ \hat{\theta}=\frac{\sum_{i=1}^{n}X_i}{n}=\bar{X} $
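
As a numerical sanity check (not part of the original derivation), the sketch below simulates exponential data and compares the closed-form MLE $ \hat{\theta}=\bar{x} $ against a brute-force grid search over the log-likelihood; the seed, sample size, and true $ \theta $ are arbitrary illustrative choices.

<pre>
import numpy as np

# Simulate n draws from an exponential distribution with a known theta
# (theta_true = 2.5 is an arbitrary illustrative value).
rng = np.random.default_rng(0)
theta_true = 2.5
x = rng.exponential(scale=theta_true, size=10_000)

# Closed-form MLE from the derivation above: theta_hat = sum(x_i) / n
theta_hat = x.mean()

# Numerical check: evaluate ln L(theta) = -n ln(theta) - sum(x_i)/theta
# on a grid and take the argmax.
grid = np.linspace(0.5, 5.0, 2001)
loglik = -x.size * np.log(grid) - x.sum() / grid
theta_grid = grid[np.argmax(loglik)]

print(theta_hat, theta_grid)  # both should be close to theta_true
</pre>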


Geometric Distribution

Let $ X_1, X_2, \ldots, X_n $ be a random sample from the geometric distribution with p.m.f.

$ f(x;p)=(1-p)^{x-1}p, \quad x=1,2,3,\ldots $

The likelihood function is given by:

$ L(p)=(1-p)^{x_1-1}p\,(1-p)^{x_2-1}p\cdots(1-p)^{x_n-1}p=p^n(1-p)^{\sum_{i=1}^{n}x_i-n} $

Taking the logarithm, we get

$ \ln L(p)=n\ln p+\left(\sum_{i=1}^{n}x_i-n\right)\ln(1-p) $

Differentiating with respect to $ p $ and equating to zero, we get

$ \frac{d\left[\ln L(p)\right]}{dp}=\frac{n}{p}-\frac{\sum_{i=1}^{n}x_i-n}{1-p}=0 $

Therefore,

$ p=\frac{n}{\sum_{i=1}^{n}x_i} $
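
As in the exponential case, a second-derivative check (added here for completeness) confirms a maximum; here the second derivative is negative for every $ p\in(0,1) $, since $ \sum_{i=1}^{n}x_i\ge n $:

$ \frac{d^2\left[\ln L(p)\right]}{dp^2}=-\frac{n}{p^2}-\frac{\sum_{i=1}^{n}x_i-n}{(1-p)^2}<0 $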

So, the maximum likelihood estimator of $ p $ is:

$ \hat{p}=\frac{n}{\sum_{i=1}^{n}X_i}=\frac{1}{\bar{X}} $

This agrees with intuition: in $ n $ observations of a geometric random variable, there are $ n $ successes in $ \sum_{i=1}^{n}X_i $ total trials, so the estimate of $ p $ is the number of successes divided by the total number of trials.
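
A parallel sanity check (again, not part of the original derivation) for the geometric case: simulate data, compute $ \hat{p}=1/\bar{x} $, and compare with a grid search over $ \ln L(p) $. NumPy's geometric sampler uses the same support $ x=1,2,3,\ldots $ as the p.m.f. above; the seed, sample size, and true $ p $ are arbitrary illustrative choices.

<pre>
import numpy as np

# Simulate n draws from a geometric distribution with a known p
# (p_true = 0.3 is an arbitrary illustrative value).
rng = np.random.default_rng(0)
p_true = 0.3
x = rng.geometric(p_true, size=10_000)  # support starts at 1

# Closed-form MLE from the derivation above: p_hat = n / sum(x_i) = 1 / x_bar
p_hat = 1.0 / x.mean()

# Numerical check: evaluate ln L(p) = n ln(p) + (sum(x_i) - n) ln(1 - p)
# on a grid and take the argmax.
grid = np.linspace(0.01, 0.99, 981)
loglik = x.size * np.log(grid) + (x.sum() - x.size) * np.log(1 - grid)
p_grid = grid[np.argmax(loglik)]

print(p_hat, p_grid)  # both should be close to p_true
</pre>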
