=Details of Lecture 23, [[ECE662]] Spring 2010=

April 15, 2010
In Lecture 23, we continued discussing [[Support_Vector_Machines|Support Vector Machines]]. We presented an example of a non-linear separation in the feature space that becomes linear in an extended feature space defined by monomials. We pointed out the difficulty of working in the typically high-dimensional spaces that arise when the feature space is extended to the monomial space. Then we noted that this difficulty would later be addressed by the "kernel trick". We then began presenting the formulation of a minimization problem, involving only the support vectors, which can be used to find a linear separation between training samples from two classes.
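The two ideas above can be illustrated with a small sketch (not from the lecture itself; plain numpy, using the conventional sqrt(2) scaling of the cross term so that inner products in the degree-2 monomial space reproduce the degree-2 polynomial kernel):

```python
import numpy as np

def monomial_map(x):
    """Map a 2D point (x1, x2) into the degree-2 monomial space.
    The sqrt(2) on the cross term is a standard convention that makes
    inner products in this space equal the polynomial kernel (x . y)^2."""
    x1, x2 = x
    return np.array([x1**2, np.sqrt(2) * x1 * x2, x2**2])

def poly_kernel(x, y):
    """Degree-2 polynomial kernel, computed in the ORIGINAL 2D space."""
    return np.dot(x, y) ** 2

# A circle x1^2 + x2^2 = r^2 is non-linear in (x1, x2) but LINEAR in the
# monomial coordinates (z1, z2, z3): the separating "plane" is z1 + z3 = r^2.
inside = monomial_map(np.array([0.3, 0.4]))   # radius 0.5, inside unit circle
outside = monomial_map(np.array([1.0, 1.0]))  # radius sqrt(2), outside it
print(inside[0] + inside[2] < 1.0)   # True
print(outside[0] + outside[2] > 1.0) # True

# The "kernel trick": <phi(x), phi(y)> equals (x . y)^2, so the (possibly
# huge) monomial space never has to be formed explicitly.
x = np.array([1.0, 2.0])
y = np.array([3.0, -1.0])
explicit = np.dot(monomial_map(x), monomial_map(y))
implicit = poly_kernel(x, y)
print(explicit, implicit)  # both equal 1.0
```

In higher dimensions or at higher degrees the monomial space grows combinatorially, which is exactly why evaluating the kernel in the original space is attractive.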
  
Meanwhile, there is an interesting discussion under way on Rhea about [[Bayes Rate Fallacy: Bayes Rules under Severe Class Imbalance‎|Bayes rule under severe class imbalance]]. Please join in!
  
Note: [[Hw2_ECE662Spring2010|Homework 2]] is due today. Please hand it in at your instructor's dropbox (in the "homework2" box if you see it).
<span style="color:green"> Do not forget that the next lecture will be tomorrow, Friday, April 16, 1:30-2:30 in EE117. </span>
  
 
Previous: [[Lecture22ECE662S10|Lecture 22]]

Next: Lecture 24