== Bayes Error for Minimizing Risk ==

Contents:
* Introduction
* Bayes rule for minimizing risk
* Example 1: 1D features
* Example 2: 2D features
* Summary and Conclusions

== Summary and Conclusions ==
In this lecture we have shown that the probability of error ($ Prob \left[ Error \right] $) when using Bayes rule is upper bounded by the Chernoff bound. Therefore,

$ Prob \left[ Error \right] \le \varepsilon_{\beta} = Prob(\omega_1)^{\beta} \, Prob(\omega_2)^{1-\beta} \int \rho(\vec{x}|\omega_1)^{\beta} \, \rho(\vec{x}|\omega_2)^{1-\beta} \, d\vec{x} $

for $ \beta \in \left[ 0, 1 \right] $.

When $ \beta = \frac{1}{2} $, the bound $ \varepsilon_{\frac{1}{2}} $ is known as the Bhattacharyya bound.
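To make the bound concrete, here is a minimal numerical sketch (not part of the original lecture) that evaluates the Bhattacharyya bound ($ \beta = \frac{1}{2} $) and the tightest Chernoff bound over $ \beta \in \left[ 0, 1 \right] $ for two hypothetical 1D Gaussian class-conditional densities, and compares them with the Bayes error obtained by numerical integration. The means, standard deviations, and priors below are illustrative assumptions, not values from the lecture.

<pre>
import numpy as np

# Hypothetical 1D example (not from the lecture): two Gaussian class-conditional
# densities with chosen priors, used to illustrate the Chernoff/Bhattacharyya bounds.
mu1, sigma1 = 0.0, 1.0      # class omega_1
mu2, sigma2 = 2.0, 1.5      # class omega_2
P1, P2 = 0.5, 0.5           # priors Prob(omega_1), Prob(omega_2)

x = np.linspace(-10.0, 12.0, 20001)   # integration grid
dx = x[1] - x[0]

def gaussian(x, mu, sigma):
    return np.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))

p1 = gaussian(x, mu1, sigma1)         # rho(x | omega_1)
p2 = gaussian(x, mu2, sigma2)         # rho(x | omega_2)

def epsilon(beta):
    """eps_beta = P1^beta * P2^(1-beta) * integral of p1^beta * p2^(1-beta) dx."""
    return (P1 ** beta) * (P2 ** (1 - beta)) * np.sum(p1 ** beta * p2 ** (1 - beta)) * dx

# True Bayes error: integral of min(P1*p1, P2*p2) dx
bayes_error = np.sum(np.minimum(P1 * p1, P2 * p2)) * dx

# Bhattacharyya bound (beta = 1/2) and the tightest Chernoff bound over beta in (0, 1)
betas = np.linspace(0.01, 0.99, 99)
chernoff = min(epsilon(b) for b in betas)

print(f"Bayes error          : {bayes_error:.4f}")
print(f"Bhattacharyya bound  : {epsilon(0.5):.4f}")
print(f"Chernoff bound (min) : {chernoff:.4f}")
</pre>

As expected, the Chernoff bound (minimized over $ \beta $) is at least as tight as the Bhattacharyya bound, and both upper-bound the Bayes error.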
== References ==

[1]. Duda, Richard O., Hart, Peter E., and Stork, David G., "Pattern Classification (2nd Edition)," Wiley-Interscience, 2000.

[2]. Mireille Boutin, "ECE662: Statistical Pattern Recognition and Decision Making Processes," Purdue University, Spring 2014.
== Questions and comments ==

If you have any questions, comments, etc. please post them on [[Bayes_Rule_for_Minimizing_Risk_Questions_and_comment|this page]].