Revision as of 00:09, 7 April 2008 by Arora6 (Talk)


[Figure: Bay1 Old Kiwi.jpg — objects labeled GREEN or RED]

In the above figure, the objects are classified as either GREEN or RED. Our job is to assign class labels to new cases as they arrive, using the Bayesian approach.

PRIOR PROBABILITIES: Since there are twice as many GREEN objects as RED, it is reasonable to believe that a new case is twice as likely to belong to GREEN as to RED.

Prior probability for GREEN = 20/30

Prior probability for RED = 10/30
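The prior computation above can be sketched in a few lines of Python, using the counts read off the figure (20 GREEN and 10 RED objects):

```python
# Class counts taken from the figure: 20 GREEN and 10 RED objects
counts = {"GREEN": 20, "RED": 10}
total = sum(counts.values())  # 30 objects in all

# Prior probability of each class = class count / total count
priors = {label: n / total for label, n in counts.items()}
print(priors["GREEN"])  # 20/30 ≈ 0.667
print(priors["RED"])    # 10/30 ≈ 0.333
```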

[Figure: Bay2 Old Kiwi.jpg — a new object, shown as a white circle, with a neighborhood circle drawn around it]

Now, we classify a new object, shown as a white circle in the above figure.

LIKELIHOOD: To calculate the likelihood, we look at the number of points of each color in the vicinity of the new object. For this purpose we draw a circle around the new object that contains a fixed number (chosen a priori) of points, irrespective of their class labels. The likelihood for each class is then the ratio of the number of points of that color inside the circle to the total number of points of that color.

Likelihood of X to be GREEN = 1/20

Likelihood of X to be RED = 2/10
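Using the neighborhood counts from the figure (1 GREEN point and 2 RED points inside the circle), the likelihoods can be sketched as:

```python
# Points of each class falling inside the circle drawn around X
in_circle = {"GREEN": 1, "RED": 2}
# Total points of each class in the data set
class_totals = {"GREEN": 20, "RED": 10}

# Likelihood of X given a class = neighbors of that class / class total
likelihoods = {c: in_circle[c] / class_totals[c] for c in class_totals}
print(likelihoods["GREEN"])  # 1/20 = 0.05
print(likelihoods["RED"])    # 2/10 = 0.2
```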

According to the PRIOR probabilities, X should belong to GREEN; but the LIKELIHOOD indicates that its class should be RED, because there are more RED points than GREEN points in its vicinity.

The Bayesian classifier makes the final decision by combining both PRIOR PROBABILITY and LIKELIHOOD into a POSTERIOR probability using Bayes' Rule.

Posterior probability of X to be GREEN ∝ (Prior probability for GREEN) × (Likelihood of X to be GREEN) = (20/30) × (1/20) = 1/30

Posterior probability of X to be RED ∝ (Prior probability for RED) × (Likelihood of X to be RED) = (10/30) × (2/10) = 2/30

Finally, X is classified as RED because its posterior probability of being RED is higher.
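Putting the two steps together, here is a minimal end-to-end sketch. The priors and likelihoods are the fixed values read off the figures; the normalizing constant of Bayes' Rule is dropped because it is the same for both classes and does not change the ordering:

```python
# Priors from the class counts, likelihoods from the neighborhood counts
priors = {"GREEN": 20 / 30, "RED": 10 / 30}
likelihoods = {"GREEN": 1 / 20, "RED": 2 / 10}

# Bayes' Rule up to a shared normalizing constant:
# posterior ∝ prior × likelihood
unnormalized = {c: priors[c] * likelihoods[c] for c in priors}
# GREEN: (20/30) × (1/20) = 1/30;  RED: (10/30) × (2/10) = 2/30

# The class with the larger posterior wins
label = max(unnormalized, key=unnormalized.get)
print(label)  # RED
```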


REFERENCE: Pawel Lewicki and Thomas Hill, Statistics: Methods and References.
