Details of Lecture 23, ECE662 Spring 2010

April 15, 2010

In Lecture 23, we continued discussing [[Support_Vector_Machines|Support Vector Machines]]. We presented an example of a non-linear separation in the feature space that becomes linear in an extended feature space defined by monomials. We pointed out the difficulty of working in the typically high-dimensional spaces created when the feature space is extended to the monomial space, and noted that this difficulty would later be addressed by the "kernel trick". We then began presenting the formulation of a minimization problem, involving only the support vectors, that can be used to find a linear separation between training samples from two classes.
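To make these ideas concrete, here is a small illustration (the notation below is introduced for this summary; the lecture's own notation may differ). The degree-two monomial map on the plane,

<math>\varphi(x_1, x_2) = \left( x_1^2, \ \sqrt{2}\, x_1 x_2, \ x_2^2 \right),</math>

sends data that are not linearly separable in the original feature space to an extended space where a separating hyperplane may exist. The "kernel trick" rests on the identity

<math>\langle \varphi(x), \varphi(z) \rangle = \left( x^\top z \right)^2,</math>

so inner products in the extended space can be evaluated directly in the original space, and <math>\varphi</math> never needs to be computed explicitly. The minimization problem mentioned above is, in its standard (hard-margin) form,

<math>\min_{w,\, b} \ \frac{1}{2} \| w \|^2 \quad \text{subject to} \quad y_i \left( w^\top x_i + b \right) \ge 1 \ \text{for all } i,</math>

for training pairs <math>(x_i, y_i)</math> with labels <math>y_i = \pm 1</math>; the optimal <math>w</math> can be written as <math>w = \sum_i \alpha_i y_i x_i</math>, where only the support vectors have <math>\alpha_i > 0</math>.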

Meanwhile, there is an interesting discussion under way on Rhea about [[Bayes Rate Fallacy: Bayes Rules under Severe Class Imbalance‎|Bayes rule under severe class imbalance]]. Please join in!

Note: [[Hw2_ECE662Spring2010|Homework 2]] is due today. Please hand it in to your instructor's dropbox (in the "homework2" box if you see it).

Do not forget that the next lecture will be tomorrow, Friday April 16, 1:30-2:30, in EE117.

Previous: Lecture 22 Next: Lecture 24


Back to course outline

Back to 2010 Spring ECE 662 mboutin

Back to ECE662
