Maximum Likelihood Estimation (MLE) example: Exponential and Geometric Distributions

Link to other examples: Binomial and Poisson distributions


Exponential Distribution

Let $ {X}_{1}, {X}_{2}, \ldots, {X}_{n} $ be a random sample from the exponential distribution with p.d.f.

$ f(x;\theta)=\frac{1}{\theta}{e}^{\frac{-x}{\theta}}, \quad 0<x<\infty, \quad \theta\in\Omega=\left\{\theta \mid 0<\theta<\infty\right\} $

The likelihood function is given by:

$ L(\theta)=L\left(\theta;{x}_{1},{x}_{2},\ldots,{x}_{n}\right)=\left(\frac{1}{\theta}{e}^{\frac{-{x}_{1}}{\theta}}\right)\left(\frac{1}{\theta}{e}^{\frac{-{x}_{2}}{\theta}}\right)\cdots\left(\frac{1}{\theta}{e}^{\frac{-{x}_{n}}{\theta}}\right)=\frac{1}{{\theta}^{n}}\exp\left(\frac{-\sum_{1}^{n}{x}_{i}}{\theta}\right) $

Taking the log, we get

$ \ln L\left(\theta\right)=-n\ln\left(\theta\right)-\frac{1}{\theta}\sum_{1}^{n}{x}_{i}, \quad 0<\theta<\infty $

Differentiating the above expression with respect to $ \theta $ and equating to zero, we get

$ \frac{d\left[\ln L\left(\theta\right)\right]}{d\theta}=-\frac{n}{\theta}+\frac{1}{{\theta}^{2}}\sum_{1}^{n}{x}_{i}=0 $

Solving this equation for $ \theta $ gives:

$ \theta=\frac{\sum_{1}^{n}{x}_{i}}{n} $

Thus, the maximum likelihood estimator of $ \theta $ is the sample mean:

$ \hat{\theta}=\frac{\sum_{1}^{n}{X}_{i}}{n}=\bar{X} $
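
As a quick sanity check of this result, here is a minimal Python sketch (not part of the original derivation; the NumPy usage, seed, sample size, and grid range are illustrative assumptions). It simulates exponential data with a known $ \theta $ and confirms that the closed-form MLE, the sample mean, also maximizes the log-likelihood derived above over a grid of candidate values:

import numpy as np

rng = np.random.default_rng(0)          # fixed seed for reproducibility (illustrative choice)
theta_true = 2.0
x = rng.exponential(scale=theta_true, size=10_000)

def log_likelihood(theta, x):
    # ln L(theta) = -n ln(theta) - (1/theta) * sum(x_i), as derived above
    return -len(x) * np.log(theta) - x.sum() / theta

theta_hat = x.mean()                    # closed-form MLE: sum(x_i) / n
grid = np.linspace(0.5, 4.0, 2000)      # candidate theta values
grid_best = grid[np.argmax([log_likelihood(t, x) for t in grid])]

print(theta_hat)    # should be close to theta_true = 2.0
print(grid_best)    # should agree with theta_hat up to grid resolution

Running this prints a sample mean near 2.0, and the grid search lands on essentially the same value, consistent with the derivative computation above.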


Geometric Distribution

Let $ {X}_{1}, {X}_{2}, \ldots, {X}_{n} $ be a random sample from the geometric distribution with p.d.f.

$ f\left(x;p\right)={\left(1-p\right)}^{x-1}p, \quad x=1,2,3,\ldots $

The likelihood function is given by:

$ L\left(p\right)={\left(1-p\right)}^{{x}_{1}-1}p\,{\left(1-p\right)}^{{x}_{2}-1}p\cdots{\left(1-p\right)}^{{x}_{n}-1}p={p}^{n}{\left(1-p\right)}^{\sum_{1}^{n}{x}_{i}-n} $

Taking the log,

$ \ln L\left(p\right)=n\ln p+\left(\sum_{1}^{n}{x}_{i}-n\right)\ln\left(1-p\right) $

Differentiating and equating to zero, we get

$ \frac{d\left[\ln L\left(p\right)\right]}{dp}=\frac{n}{p}-\frac{\sum_{1}^{n}{x}_{i}-n}{1-p}=0 $

Therefore,

$ p=\frac{n}{\left(\sum_{1}^{n}{x}_{i} \right)} $

So, the maximum likelihood estimator of $ p $ is:

$ \hat{p}=\frac{n}{\sum_{1}^{n}{X}_{i}}=\frac{1}{\bar{X}} $

This agrees with intuition: in n observations of a geometric random variable, there are n successes in $ \sum_{1}^{n}{X}_{i} $ total trials. Thus the estimate of p is the number of successes divided by the total number of trials.
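
The same kind of numerical sanity check works here (again a minimal Python sketch, not part of the original derivation; the NumPy usage, seed, and chosen p are illustrative assumptions). Note that NumPy's geometric sampler counts the number of trials up to and including the first success, i.e. its support is {1, 2, 3, ...}, matching the p.d.f. above:

import numpy as np

rng = np.random.default_rng(1)          # fixed seed for reproducibility (illustrative choice)
p_true = 0.3
x = rng.geometric(p=p_true, size=10_000)

def log_likelihood(p, x):
    # ln L(p) = n ln(p) + (sum(x_i) - n) ln(1 - p), as derived above
    n = len(x)
    return n * np.log(p) + (x.sum() - n) * np.log(1.0 - p)

p_hat = 1.0 / x.mean()                  # closed-form MLE: n / sum(x_i) = 1 / x-bar
grid = np.linspace(0.01, 0.99, 2000)    # candidate p values
grid_best = grid[np.argmax([log_likelihood(p, x) for p in grid])]

print(p_hat)        # should be close to p_true = 0.3
print(grid_best)    # should agree with p_hat up to grid resolution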


Back to ECE662
