Revision as of 18:11, 31 March 2016 by Mboutin (Talk | contribs)



Third Mini-Project, ECE662 Spring 2016

Due in class Friday April 15. Earlier submissions are welcome!

Note about late submissions: Late submissions will be accepted until 5pm Thursday April 21 in MSEE330. This is a hard deadline. NO EXCEPTIONS! Please try not to be late; hand in your report by the April 15 deadline.


Question

We have learned how to estimate linear classifiers. We have also learned how support vector machines can be used to estimate non-linear discriminant functions as if they were linear. Experiment with linear classifiers and support vector machines to classify data. When do they work well? When do they not work well? In what way is one method more advantageous than the other? How do these two methods compare with the previously studied MLE-based classification and Parzen-window-based classification?
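As a hedged sketch of what one such experiment might look like (the class means, covariances, and sample sizes below are illustrative assumptions, not project data), one can draw two Gaussian classes and fit a least-squares linear decision boundary:

```python
# Hedged sketch: a minimal classification experiment on synthetic Gaussian
# data. All distribution parameters here are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)
n = 200
X0 = rng.normal(loc=[-2.0, 0.0], scale=1.0, size=(n, 2))  # class 0 samples
X1 = rng.normal(loc=[+2.0, 0.0], scale=1.0, size=(n, 2))  # class 1 samples
X = np.vstack([X0, X1])
y = np.hstack([-np.ones(n), np.ones(n)])  # labels in {-1, +1}

# Least-squares linear classifier: solve min ||[X, 1] a - y||^2 for a = [w, b]
Xa = np.hstack([X, np.ones((2 * n, 1))])  # append a bias column
wb, *_ = np.linalg.lstsq(Xa, y, rcond=None)
pred = np.sign(Xa @ wb)
accuracy = (pred == y).mean()
print(f"training accuracy: {accuracy:.3f}")
```

Repeating this while varying the class separation, covariance shape, and sample size (and swapping in an SVM with a non-linear kernel) gives the kind of numerical comparisons the report asks for.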

Write a report to explain your experiments and summarize your findings. Make sure to include a cover page, introduction, numerical results (e.g., tables, graphs,…), and a conclusion. Attach a copy of your code.

Hand in two hard copies of your report. The first one should start with a standard cover page. For the second one, remove your name from the cover page. Place the anonymous submission below the submission with a name and hand in both together.

  • DO NOT PLAGIARIZE!!!!! You will get in serious trouble if you do so. Don't use figures from the internet, don't copy and paste other people's text, cite all your sources, etc. When in doubt, ask your instructor.
  • As stated in class, you must implement the linear classifier yourself. (Do not use a linear classifier package; you must write your own code to find the normal vector to the hyperplane and the threshold.)
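One standard way to obtain the normal vector and threshold yourself, without a classifier package, is Fisher's linear discriminant. A hedged sketch under assumed toy data (the two point clouds below are illustrative only):

```python
# Hedged sketch: normal vector w and threshold b via Fisher's linear
# discriminant. The sample point clouds are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(1)
X0 = rng.normal([-1.5, 0.0], 0.8, size=(100, 2))  # class 0
X1 = rng.normal([+1.5, 0.0], 0.8, size=(100, 2))  # class 1

m0, m1 = X0.mean(axis=0), X1.mean(axis=0)
# Within-class scatter (sum of per-class sample covariances)
Sw = np.cov(X0, rowvar=False) + np.cov(X1, rowvar=False)
w = np.linalg.solve(Sw, m1 - m0)   # normal vector to the separating hyperplane
b = -w @ (m0 + m1) / 2.0           # threshold at the midpoint between means

def decide(x):
    """Assign class 1 if the point falls on the positive side of the plane."""
    return 1 if w @ x + b > 0 else 0

acc = np.mean([decide(x) == 1 for x in X1] + [decide(x) == 0 for x in X0])
print(f"w = {w}, b = {b:.3f}, training accuracy = {acc:.3f}")
```

The threshold placed at the midpoint of the projected class means is a simple choice; part of the experimentation could be comparing it against other thresholds.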

Back to ECE 662 Spring 2016
