----

[https://www.projectrhea.org/rhea/index.php/2014_Spring_ECE_662_Boutin Back to ECE662, Spring 2014]

<center><font size="4">Questions and Comments for: '''Support Vector Machine'''</font>

A slecture by student Tao Jiang</center>

Please leave me a comment below if you have any questions, if you notice any errors, or if you would like to discuss a topic further.

==Question/comment==


----

----
==Review==
 
This slecture was reviewed by Robert Ness.

<br>

The student gives an in-depth introduction to SVM. He pays specific attention to the concept of slack variables, the kernel trick, and the extension from binary to multiclass classification. He also gives a concise breakdown of SVM in the language of optimization.

Strong points:

The slecture breaks SVM down in terms of the underlying optimization problem. I expect that most people who use SVM in practice don't concern themselves too much with this, but going through the steps outlined here gives one a good intuition for SVM. This is especially true of the kernel trick, which is much easier to appreciate when seen in the context of the optimization problem.
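
For reference, the optimization problem referred to here can be stated in its standard soft-margin textbook form (the notation below may differ from the slecture's): given training pairs <math>(x_n, t_n)</math> with labels <math>t_n \in \{-1, +1\}</math> and slack variables <math>\xi_n</math>,

:<math>\min_{w,\,b,\,\xi}\ \frac{1}{2}\|w\|^2 + C\sum_{n=1}^{N}\xi_n \quad \text{subject to} \quad t_n\left(w^{T}\phi(x_n)+b\right) \ge 1-\xi_n,\quad \xi_n \ge 0.</math>

The kernel trick becomes visible in the dual, where the feature map <math>\phi</math> enters only through inner products <math>k(x_n, x_m) = \phi(x_n)^{T}\phi(x_m)</math>:

:<math>\max_{a}\ \sum_{n=1}^{N} a_n - \frac{1}{2}\sum_{n=1}^{N}\sum_{m=1}^{N} a_n a_m t_n t_m\, k(x_n, x_m) \quad \text{subject to} \quad 0 \le a_n \le C,\quad \sum_{n=1}^{N} a_n t_n = 0.</math>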

He also extends his explanation to classifying multiple categories, which goes above and beyond what was covered in class.
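
The slecture's own multiclass construction is not reproduced here, but as an illustration of the general idea, a minimal one-versus-rest sketch (one standard way of building a multiclass classifier out of binary SVMs; the dataset and parameter values below are illustrative, not taken from the slecture) could look like this:

<pre>
# Hypothetical one-versus-rest sketch: one binary SVM per class, each trained
# to separate its class from all the others. Dataset and parameters are
# illustrative only.
from sklearn.datasets import load_iris
from sklearn.multiclass import OneVsRestClassifier
from sklearn.svm import SVC

X, y = load_iris(return_X_y=True)  # three classes, four features

# At prediction time, the class whose binary classifier gives the largest
# decision value wins.
clf = OneVsRestClassifier(SVC(kernel="rbf", C=1.0, gamma="scale"))
clf.fit(X, y)
print(clf.predict(X[:5]))
</pre>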

Critiques:

At time 10:58, you state that you need the condition t*y > 0. This will confuse viewers who have not seen it before.

In the discussion of slack variables, it would be useful to explain why one doesn't simply keep mapping to a higher-dimensional space until a linear separation is obtained.

The discussion of the choice of parameters around time 50:00 could have been enhanced with a mention of cross-validation, which is often how such parameters are chosen in practice.
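
As a concrete illustration of this suggestion, a minimal cross-validation sketch (using scikit-learn; the grid values are illustrative and not taken from the slecture) might look like this:

<pre>
# Hypothetical sketch of choosing the SVM penalty C and RBF kernel width gamma
# by 5-fold cross-validation; the grid values are illustrative only.
from sklearn.datasets import load_iris
from sklearn.model_selection import GridSearchCV
from sklearn.svm import SVC

X, y = load_iris(return_X_y=True)

param_grid = {"C": [0.1, 1, 10, 100], "gamma": [0.01, 0.1, 1]}

# The (C, gamma) pair with the highest mean held-out accuracy is selected.
search = GridSearchCV(SVC(kernel="rbf"), param_grid, cv=5)
search.fit(X, y)
print(search.best_params_, search.best_score_)
</pre>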
 
----
 
----
 
[[2014_Spring_ECE_662_Boutin|Back to ECE 662 2014 course wiki]]

[[ECE662|Back to ECE 662 course page]]
