This page and its subtopics discuss everything about Bayesian Decision Theory.
 
Lectures discussing Bayesian Decision Theory: [[Lecture 3_OldKiwi]] and [[Lecture 4_OldKiwi]]

[[Bayes Example_OldKiwi|Example]]
 
== References ==

Some nice tutorials and documents on Bayes decision theory can be found at:

*http://www.cse.unr.edu/~bebis/CS679/Handouts/DHS2.11Revised.pdf
*http://rii.ricoh.com/~stork/DHSch2part3.ppt
*http://www.ee.technion.ac.il/courses/048995/DecisionTheory.pdf
*http://en.wikipedia.org/wiki/Probability_distribution
*http://en.wikipedia.org/wiki/Bayesian_inference
  
== The Neyman-Pearson Framework ==
  
The Bayesian approach to classification minimizes an expected cost, so one needs to assign a-priori probabilities and costs. Situations exist in pattern classification where it is not reasonable to assign a-priori probabilities, or even to assign costs; for example, it is very difficult to assign a-priori probabilities to highly unlikely events. In such cases the Bayesian approach is not the most reasonable framework, and one often uses the Neyman-Pearson approach instead: maximize the probability of detection (one type of error) while constraining the probability of false alarm to be less than or equal to a chosen value. For example, with a radar it may be acceptable to raise a false alarm (declaring that an enemy aircraft is approaching when it is not), but it is very important to maximize the probability of detecting a real attack. Since it is not always reasonable to assign a-priori probabilities in such situations, the Neyman-Pearson framework is the more appropriate approach to classification.
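
As a rough illustration, the sketch below chooses a Neyman-Pearson threshold for a simple detection problem. The Gaussian signal model and all of the numbers here are assumptions made for this illustration; they are not part of the discussion above.

<pre>
from scipy.stats import norm

# Hypothetical detection problem (assumed for illustration):
#   H0 (no target): X ~ N(0, 1)
#   H1 (target):    X ~ N(mu, 1) with mu > 0
# The likelihood ratio is monotone increasing in x, so the
# Neyman-Pearson test reduces to thresholding x itself.
mu = 1.5      # assumed signal strength under H1
alpha = 0.01  # maximum allowed probability of false alarm

# Pick the threshold t so that P(X > t | H0) = alpha.
t = norm.ppf(1.0 - alpha)  # inverse CDF of the standard normal

# Resulting probability of detection, P(X > t | H1).
p_detect = 1.0 - norm.cdf(t - mu)

print("threshold       =", round(t, 3))
print("P(false alarm)  =", alpha)
print("P(detection)    =", round(p_detect, 3))
</pre>

Raising alpha buys a higher detection probability; the constraint fixes one error rate, and the test then makes the other error rate as small as possible, with no priors or costs needed.
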
== A Bayes Classifier Example ==
[[Image:figure13_OldKiwi.bmp]]

In the figure above, the objects are classified as either GREEN or RED. Our job is to assign class labels to new cases as they arrive, using the Bayesian approach.

===Prior Probabilities===

Since there are twice as many GREEN objects as RED, it is reasonable to believe that a new case is twice as likely to be GREEN as RED.

Prior probability for GREEN = 20/30

Prior probability for RED = 10/30

[[Image:figure12_OldKiwi.bmp]]

Now we classify a new object, shown as a white circle in the figure above.

===Likelihood===

To calculate the likelihood, we look at the number of points of each color in the vicinity of the new object. For this purpose we draw a circle around the new object that encloses a predetermined number of points, irrespective of their class labels. The likelihood for each class is the ratio of the number of points of that color inside the circle to the total number of points of that color.

Likelihood of X being GREEN = 1/20

Likelihood of X being RED = 2/10

According to the prior probabilities, X should belong to GREEN; but the likelihood indicates that its class should be RED, because there are more RED points than GREEN points in its vicinity.

The Bayesian classifier makes the final decision by combining the prior probability and the likelihood into a posterior probability using Bayes' rule.
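
In symbols, for each class <math>\omega</math>, Bayes' rule gives

<math>P(\omega \mid X) = \frac{p(X \mid \omega)\,P(\omega)}{p(X)} \propto P(\omega)\,p(X \mid \omega),</math>

where the evidence <math>p(X)</math> can be dropped when comparing classes because it is the same for both.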

Posterior probability of X being GREEN is proportional to (prior for GREEN) * (likelihood of X being GREEN) = 20/30 * 1/20 = 1/30

Posterior probability of X being RED is proportional to (prior for RED) * (likelihood of X being RED) = 10/30 * 2/10 = 2/30

Finally, X is classified as RED because its posterior probability of being RED is the higher of the two.
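
The whole procedure fits in a few lines. The following sketch simply reproduces the counts from the example above; the variable names are mine, not from the original text.

<pre>
# Bayes classification by counting neighbors, following the example above.
n_green_total, n_red_total = 20, 10
n_total = n_green_total + n_red_total

# Counts of each color inside the circle drawn around X (from the figure).
n_green_near, n_red_near = 1, 2

# Prior probabilities from the overall class counts.
prior_green = n_green_total / n_total  # 20/30
prior_red = n_red_total / n_total      # 10/30

# Likelihoods: fraction of each class that falls inside the circle.
lik_green = n_green_near / n_green_total  # 1/20
lik_red = n_red_near / n_red_total        # 2/10

# Unnormalized posteriors (Bayes' rule with the evidence term dropped).
post_green = prior_green * lik_green  # 1/30
post_red = prior_red * lik_red        # 2/30

label = "GREEN" if post_green > post_red else "RED"
print("posterior(GREEN) ~", round(post_green, 4))
print("posterior(RED)   ~", round(post_red, 4))
print("X is classified as", label)  # -> RED
</pre>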
Reference: ''Statistics: Methods and References'' by Pawel Lewicki and Thomas Hill

[[Category:ECE662]]
