
Hypothesis Testing

The goal of pattern recognition is to classify a new sample. The classification decision is made by treating the sample as a random variable whose conditional density is determined by its class. If the conditional densities are known, the pattern recognition problem becomes a statistical hypothesis testing problem. In what follows, assume the sample belongs to one of two classes, with known conditional densities and prior probabilities.

Bayes Decision Rule for Minimum Error

Let X be an observation vector, and let $ g_i(X) $ be the posterior probability that X comes from class $ \omega_i $. The decision rule is: if $ g_1(X) > g_2(X) $, choose $ \omega_1 $; otherwise choose $ \omega_2 $. By Bayes' theorem, the decision rule can be expressed in terms of the likelihood ratio $ l(X) $:

$ \begin{align}
& g_1(X) > g_2(X) \\
\Rightarrow & P(\omega_1|X) > P(\omega_2|X) \\
\Rightarrow & \frac{P(X|\omega_1)P(\omega_1)}{P(X)} > \frac{P(X|\omega_2)P(\omega_2)}{P(X)} \\
\Rightarrow & P(X|\omega_1)P(\omega_1) > P(X|\omega_2)P(\omega_2) \\
\Rightarrow & l(X)=\frac{P(X|\omega_1)}{P(X|\omega_2)} > \frac{P(\omega_2)}{P(\omega_1)} = k
\end{align} $
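As a concrete illustration, here is a minimal sketch of this likelihood ratio test for two classes with one-dimensional Gaussian class-conditional densities. The means, variances, and priors below are made-up values chosen only for the example; they are not taken from the text above.

<pre>
from scipy.stats import norm

# Assumed (hypothetical) class-conditional densities P(X|w1), P(X|w2):
# one-dimensional Gaussians with made-up parameters.
mu1, sigma1 = 0.0, 1.0   # class omega_1
mu2, sigma2 = 2.0, 1.5   # class omega_2

# Assumed prior probabilities P(w1), P(w2).
p1, p2 = 0.6, 0.4

def classify(x):
    """Bayes decision rule for minimum error via the likelihood ratio.

    Decide omega_1 if l(X) = P(X|w1)/P(X|w2) > P(w2)/P(w1) = k,
    otherwise decide omega_2.
    """
    likelihood_ratio = norm.pdf(x, mu1, sigma1) / norm.pdf(x, mu2, sigma2)
    k = p2 / p1
    return "omega_1" if likelihood_ratio > k else "omega_2"

# Example observations.
for x in [-0.5, 1.0, 2.5]:
    print(x, "->", classify(x))
</pre>

The same rule is often applied in the log domain, comparing $ \log l(X) $ with $ \log k $, which is numerically more stable when the densities are very small.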


Neyman-Pearson Test
