[[Category:ECE662]]
[[Category:decision theory]]
[[Category:lecture notes]]
[[Category:pattern recognition]]
[[Category:slecture]]
=Bayes Decision Rule Video=
for [[Lecture_6_-_Discriminant_Functions_OldKiwi|Lecture 6, ECE662, Spring 2010]]
----
<youtube>wzJkaATyitA</youtube>
  
The video demonstrates Bayes decision rule on 2D feature data from two classes. We visualize the decision hypersurface as a red "wall" cutting through the bi-modal distribution of the data, and observe how it changes with the parameters of the Gaussian distributions for the two classes. Note that if the covariance matrices and the priors of the classes are identical, then the decision surface cuts directly between the two modes. If the prior of one class increases, the decision surface is "pushed away" from that class's mode, biasing the classifier in favor of the more likely class.
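The prior effect described above can be sketched in a few lines of Python with NumPy. This is not the downloadable video code, just an illustration under assumed parameters: two unit-covariance Gaussians with modes at (-1, 0) and (1, 0).

```python
import numpy as np

def gaussian_log_pdf(x, mean, cov):
    # log N(x; mean, cov) for a d-dimensional Gaussian
    d = x - mean
    inv = np.linalg.inv(cov)
    _, logdet = np.linalg.slogdet(cov)
    return -0.5 * (d @ inv @ d + logdet + len(x) * np.log(2 * np.pi))

def bayes_classify(x, means, covs, priors):
    # Bayes decision rule: pick the class maximizing
    # log p(x | class) + log P(class)
    scores = [gaussian_log_pdf(x, m, c) + np.log(p)
              for m, c, p in zip(means, covs, priors)]
    return int(np.argmax(scores))

# Assumed example parameters (not from the video)
means = [np.array([-1.0, 0.0]), np.array([1.0, 0.0])]
covs = [np.eye(2), np.eye(2)]

# With identical covariances and equal priors, the decision surface
# cuts directly between the two modes, so a point slightly past the
# midpoint toward class 1 is assigned to class 1.
x = np.array([0.3, 0.0])
print(bayes_classify(x, means, covs, [0.5, 0.5]))  # prints 1

# Increasing the prior of class 0 pushes the surface away from its
# mode, so the same point is now assigned to class 0.
print(bayes_classify(x, means, covs, [0.9, 0.1]))  # prints 0
```

The boundary shift follows directly from the discriminant: with equal covariances the surface is the set where the squared-distance difference equals twice the log-prior ratio, so a larger prior for one class moves the surface toward the other mode.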
  
The code for making such a video is [http://web.ics.purdue.edu/~huffmalm/BayesDecisionSurface.tar.gz available for download here].
  
For more on Bayes' decision rule, see [[Lecture_6_-_Discriminant_Functions_Old Kiwi|Lecture 6, ECE662, Spring 2008]].
----
[[ECE662:BoutinSpring08_OldKiwi|Back to ECE662 Spring 2008]]

Latest revision as of 10:08, 10 June 2013

