==Maximum Likelihood Estimation (ML)==
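A standard form of the ML estimate of a parameter <math>\theta</math> from an observation <math>x</math> (writing <math>f_{X|\theta}</math> for the likelihood) is

<math>\hat{\theta}_{\rm ML}(x) = \arg\max_{\theta}\ f_{X|\theta}(x|\theta)</math>

i.e., the value of <math>\theta</math> that makes the observed data most likely; no prior on <math>\theta</math> is used.
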
==Maximum A-Posteriori Estimation (MAP)==
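A standard form of the MAP estimate, which weights the likelihood by a prior density <math>f_{\theta}(\theta)</math>, is

<math>\hat{\theta}_{\rm MAP}(x) = \arg\max_{\theta}\ f_{\theta|X}(\theta|x) = \arg\max_{\theta}\ f_{X|\theta}(x|\theta)\, f_{\theta}(\theta)</math>
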
==Minimum Mean-Square Estimation (MMSE)==
<math>{y}_{\rm MMSE}(x)=\int\limits_{-\infty}^{\infty} y\,{f}_{Y|X}(y|X=x)\, dy={E}[Y|X=x]</math>

Mean square error of an estimator <math>\hat{\theta}(x)</math>: <math>MSE = E[(\theta - \hat{\theta}(x))^2]</math>

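Writing <math>g(\cdot)</math> for an arbitrary estimator of <math>Y</math> from <math>X</math>, the conditional mean minimizes the mean square error

<math>E[(Y-g(X))^2]</math>

over all choices of <math>g</math>, which is why the MMSE estimate above equals <math>E[Y|X=x]</math>.
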
==Linear Minimum Mean-Square Estimation (LMMSE)==

<math>{y}_{\rm LMMSE}(x)=E[\theta]+\frac{{\rm Cov}(x,\theta)}{{\rm Var}(x)}\,(x-E[x])</math>

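The LMMSE estimate restricts attention to estimators of the affine form <math>a x + b</math> and chooses <math>a</math> and <math>b</math> to minimize <math>E[(\theta - a x - b)^2]</math>; when <math>x</math> and <math>\theta</math> are jointly Gaussian it coincides with the MMSE estimate.
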
==Hypothesis Testing: ML Rule==
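With two hypotheses <math>H_0</math> and <math>H_1</math> and an observed value <math>x</math>, a common statement of the ML rule compares the two likelihoods and ignores any prior probabilities: choose <math>H_1</math> if

<math>f_{X|H_1}(x|H_1) \ge f_{X|H_0}(x|H_0)</math>

and choose <math>H_0</math> otherwise.
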
Type I error (false alarm): choosing <math>H_1</math> when <math>H_0</math> is true; its probability is often written <math>\alpha</math>.

Type II error (miss): choosing <math>H_0</math> when <math>H_1</math> is true; its probability is often written <math>\beta</math>.

==Hypothesis Testing: MAP Rule==
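A common statement of the MAP rule weights each likelihood by its prior probability: choose <math>H_1</math> if

<math>f_{X|H_1}(x|H_1)\,P(H_1) \ge f_{X|H_0}(x|H_0)\,P(H_0)</math>

and choose <math>H_0</math> otherwise. The MAP rule minimizes the overall probability of error.
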
Overall probability of error:
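By the total probability theorem, writing <math>\alpha</math> and <math>\beta</math> for the Type I and Type II error probabilities above,

<math>P({\rm err}) = P({\rm choose\ }H_1 \mid H_0)\,P(H_0) + P({\rm choose\ }H_0 \mid H_1)\,P(H_1) = \alpha\,P(H_0) + \beta\,P(H_1)</math>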
