Maximum Likelihood Estimation (ML)

$ \hat a_{ML} = \arg\max_{a} f_{X}(x_i;a) \mbox{ (X continuous)} $
$ \hat a_{ML} = \arg\max_{a} Pr(x_i;a) \mbox{ (X discrete)} $
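
As a minimal sketch (not part of the original notes), the code below grid-searches the log-likelihood of i.i.d. N(a, 1) samples over candidate values of a; for this assumed model the maximizer should land near the sample mean, which is the closed-form ML estimate.

 import numpy as np
 
 rng = np.random.default_rng(0)
 x = rng.normal(loc=2.0, scale=1.0, size=500)  # i.i.d. samples; true a = 2
 
 # Log-likelihood of N(a, 1) on a grid of candidate values of a
 # (additive constants dropped, since they do not move the argmax)
 grid = np.linspace(0.0, 4.0, 401)
 loglik = np.array([-0.5 * np.sum((x - a) ** 2) for a in grid])
 
 a_ml = grid[np.argmax(loglik)]
 print(a_ml, x.mean())  # the grid maximizer sits near the sample mean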

Maximum A-Posteriori Estimation (MAP)

$ \hat \theta_{MAP}(x) = \arg\max_{\theta} P_{X|\theta}(x|\theta)P_{\theta}(\theta) \mbox{ (X discrete)} $
$ \hat \theta_{MAP}(x) = \arg\max_{\theta} f_{X|\theta}(x|\theta)P_{\theta}(\theta) \mbox{ (X continuous)} $
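
A minimal sketch of the MAP rule over a discrete parameter set; the candidate values, prior, and Gaussian likelihood below are illustrative assumptions, not from the notes.

 import numpy as np
 from scipy.stats import norm
 
 thetas = np.array([0.0, 1.0])  # candidate parameter values (assumed)
 prior = np.array([0.7, 0.3])   # assumed prior P_theta
 
 x = 0.8                        # a single observation
 scores = norm.pdf(x, loc=thetas, scale=1.0) * prior  # f(x|theta) * P(theta)
 theta_map = thetas[np.argmax(scores)]
 print(theta_map)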

Minimum Mean-Square Estimation (MMSE)

$ \hat{y}_{MMSE}(x) = \int_{-\infty}^{\infty} y\, f_{Y|X}(y|x)\, dy = E[Y|X=x] $
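
A numerical sketch of this conditional-expectation integral, under an assumed toy model where the answer is known in closed form:

 import numpy as np
 
 # Assumed toy model: Y ~ N(0,1) and X = Y + W with W ~ N(0,1),
 # for which E[Y|X=x] = x/2 in closed form
 x = 1.5
 y = np.linspace(-10.0, 10.0, 2001)
 dy = y[1] - y[0]
 
 post = np.exp(-0.5 * y**2) * np.exp(-0.5 * (x - y)**2)  # prior * likelihood
 post /= post.sum() * dy                                 # normalize f_{Y|X}(y|x)
 
 y_mmse = (y * post).sum() * dy
 print(y_mmse)  # ~ 0.75 = x/2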

Law Of Iterated Expectation

$ E[E[X|Y]] = \begin{cases} \sum_{y} E[X|Y = y]p_Y(y),\,\,\,\,\,\,\,\,\,\,\mbox{ Y discrete,}\\ \int_{-\infty}^{+\infty} E[X|Y = y]f_Y(y)\,dy,\mbox{ Y continuous.} \end{cases} $

Using the total expectation theorem:

$ E\big[E[X|Y]\big] = E[X] $
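
A quick simulation check of this identity, under an assumed model where E[X|Y] = Y:

 import numpy as np
 
 rng = np.random.default_rng(1)
 y = rng.exponential(scale=2.0, size=100_000)  # Y continuous, E[Y] = 2
 x = rng.normal(loc=y, scale=1.0)              # X|Y=y ~ N(y,1), so E[X|Y] = Y
 
 print(y.mean(), x.mean())  # E[E[X|Y]] = E[Y] and E[X] should both be near 2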

Mean Square Error

$ MSE = E\big[(\Theta - \hat\theta(X))^2\big] $
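
A Monte Carlo sketch of this quantity, for an assumed model where the MMSE estimator is known to be X/2:

 import numpy as np
 
 rng = np.random.default_rng(2)
 theta = rng.normal(size=100_000)         # Theta ~ N(0,1)
 x = theta + rng.normal(size=theta.size)  # X = Theta + noise, noise ~ N(0,1)
 
 # For this model E[Theta|X] = X/2, so the MMSE estimate is x/2
 print(np.mean((theta - x / 2) ** 2))  # ~ 0.5
 print(np.mean((theta - x) ** 2))      # ~ 1.0: using X directly does worse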

Linear Minimum Mean-Square Estimation (LMMSE)

The LMMSE estimator $ \hat{Y} $ of Y based on the observation X is

$ \hat{Y}_{LMMSE}(X) = E[Y]+\frac{COV(Y,X)}{Var(X)}(X-E[X]) = E[Y] + \rho \frac{\sigma_{Y}}{\sigma_{X}}(X-E[X]) $

where

$ \rho = \frac{COV(Y,X)}{\sigma_{Y}\sigma_{X}} $
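
A sketch that plugs sample moments into the LMMSE formula; the data-generating model below is an arbitrary illustrative assumption:

 import numpy as np
 
 rng = np.random.default_rng(3)
 x = rng.uniform(-1.0, 1.0, size=50_000)
 y = x ** 3 + 0.1 * rng.normal(size=x.size)  # an assumed nonlinear relation
 
 cov_yx = np.cov(y, x)[0, 1]
 y_hat = y.mean() + (cov_yx / x.var(ddof=1)) * (x - x.mean())  # LMMSE formula
 print(np.mean((y - y_hat) ** 2))  # residual MSE of the best *linear* estimator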


Hypothesis Testing

In hypothesis testing, $ \Theta $ takes on one of m values, $ \theta_1,...,\theta_m $, where m is usually small; often m = 2, in which case it is a binary hypothesis testing problem.

The event $ \Theta = \theta_i $ is the $ i^{th} $ hypothesis, denoted $ H_i $.

ML Rule

Given a value of X, we say $ H_1 $ is true if X falls in the region R; otherwise we say $ H_0 $ is true.

Type I Error: False Rejection

Say $ H_1 $ when truth is $ H_0 $. Probability of this is:

$ Pr(\mbox{Say } H_1|H_0) = Pr(x \in R|\theta_0) $

Type II Error: False Acceptance

Say $ H_0 $ when truth is $ H_1 $. Probability of this is:

$ Pr(\mbox{Say }H_0|H_1) = Pr(x \in R^C|\theta_1) $
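
A sketch computing both error probabilities for an assumed binary Gaussian example; the distributions, region R, and threshold t below are illustrative, not from the notes.

 from scipy.stats import norm
 
 # Assumed example: X ~ N(0,1) under H0, X ~ N(1,1) under H1,
 # and the rule says H1 exactly when x falls in R = {x > t}
 t = 0.5
 type_I = norm.sf(t, loc=0.0)    # Pr(x in R   | theta_0)
 type_II = norm.cdf(t, loc=1.0)  # Pr(x in R^C | theta_1)
 print(type_I, type_II)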

MAP Rule

The MAP rule minimizes the overall probability of error:

$ \mbox{Overall } P(err) = P_{\theta}(\theta_{0})Pr\big[\mbox{Say }H_{1}|H_{0}\big] + P_{\theta}(\theta_{1})Pr\big[\mbox{Say }H_{0}|H_{1}\big] $
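
Under the same assumed Gaussian example, with illustrative priors, this overall error probability can be evaluated directly:

 from scipy.stats import norm
 
 # Assumed priors P(theta_0) = 0.7, P(theta_1) = 0.3, rule "say H1 iff x > t"
 p0, p1, t = 0.7, 0.3, 0.5
 p_err = p0 * norm.sf(t, loc=0.0) + p1 * norm.cdf(t, loc=1.0)
 print(p_err)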

Likelihood Ratio Test

How do we find a good decision rule? Define the likelihood ratio:

$ L(x) = \frac{P_{X|\theta}(x|\theta_1)}{P_{X|\theta}(x|\theta_0)} $

Choose a threshold T:

$ \mbox{Say } \begin{cases} H_{1}; \mbox{ if } L(x) > T\\ H_{0}; \mbox{ if } L(x) < T \end{cases} $

The Maximum Likelihood rule is a Likelihood Ratio Test with T = 1.
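
A sketch of the test for the same assumed Gaussian pair as above; the function name lrt is mine, and T = 1 reproduces the ML rule as stated.

 from scipy.stats import norm
 
 def lrt(x, T=1.0):
     # L(x) for the assumed pair H0: N(0,1) vs H1: N(1,1)
     L = norm.pdf(x, loc=1.0) / norm.pdf(x, loc=0.0)
     return "H1" if L > T else "H0"
 
 print(lrt(0.8))         # T = 1 reproduces the ML rule: says H1
 print(lrt(0.8, T=3.0))  # a larger T is more reluctant to say H1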

Observations (raising T makes the test more reluctant to say $ H_1 $, so false rejections become rarer while false acceptances become more common; a numerical check follows the list):

  1. as T increases, Type I error decreases
  2. as T increases, Type II error increases
  3. as T decreases, Type I error increases
  4. as T decreases, Type II error decreases
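
As a quick check of this tradeoff (a sketch, not from the original notes), the loop below reuses the assumed Gaussian pair from the LRT sketch above; under those assumptions, $ L(x) > T $ reduces to $ x > 0.5 + \ln T $.

 import numpy as np
 from scipy.stats import norm
 
 # For the assumed pair, L(x) = exp(x - 0.5), so L(x) > T iff x > 0.5 + ln(T)
 for T in [0.5, 1.0, 2.0, 4.0]:
     t = 0.5 + np.log(T)
     print(T, norm.sf(t, loc=0.0), norm.cdf(t, loc=1.0))  # Type I falls, Type II rises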
