Details of Lecture 23, ECE662 Spring 2010
April 15, 2010
In Lecture 23, we continued discussing Support Vector Machines. We presented an example of a non-linear separation in the feature space that becomes linear in an extended feature space defined by monomials. We pointed out the difficulty of working in the typically high-dimensional spaces created when extending the feature space to the monomial space, and noted that this difficulty would later be addressed by the "kernel trick". We then began presenting the formulation of a minimization problem, involving only the support vectors, that can be used to find a linear separation between training samples from two classes.
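To illustrate the idea of a non-linear boundary becoming linear after a monomial extension, here is a small sketch (not the exact example from lecture): 2-D points separated by the circle x1^2 + x2^2 = 1 cannot be split by a line in the original space, but after mapping each point to the degree-2 monomials (x1, x2, x1^2, x1*x2, x2^2), the circle corresponds to a hyperplane in the extended 5-D space.

```python
import numpy as np

def monomial_features(X):
    """Map 2-D points (x1, x2) to the degree-2 monomials
    (x1, x2, x1^2, x1*x2, x2^2). Illustrative feature map,
    chosen here as an assumption, not the lecture's exact example."""
    x1, x2 = X[:, 0], X[:, 1]
    return np.column_stack([x1, x2, x1 ** 2, x1 * x2, x2 ** 2])

# Labels defined by the circle x1^2 + x2^2 = 1:
# +1 outside the circle, -1 inside (non-linear in the original space).
rng = np.random.default_rng(0)
X = rng.uniform(-2, 2, size=(200, 2))
labels = np.sign(X[:, 0] ** 2 + X[:, 1] ** 2 - 1)

# In the extended space, the same boundary is the hyperplane
# w . z + b = 0 with w = (0, 0, 1, 0, 1) and b = -1.
Z = monomial_features(X)
w = np.array([0.0, 0.0, 1.0, 0.0, 1.0])
b = -1.0
pred = np.sign(Z @ w + b)

print(np.all(pred == labels))  # the linear rule in 5-D reproduces the circle
```

Note that the dimension of the monomial space grows quickly with the input dimension and the monomial degree, which is the difficulty mentioned above and the motivation for the kernel trick.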
Meanwhile, there is an interesting discussion going on on Rhea about Bayes rate under severe class imbalance. Please join in!
Note: Homework 2 is due today. Please hand it in to your instructor's dropbox (in the "homework2" box if you see it).
Do not forget that the next lecture will be tomorrow, Friday April 16, 1:30-2:30 in EE117.
Previous: Lecture 22 Next: Lecture 24