

Lecture 6 Blog, ECE662 Spring 2012, Prof. Boutin

Thursday, January 26, 2012 (Week 3)


Today we began discussing an important topic in decision theory: Bayes rule for normally distributed feature vectors. We proposed a simple discriminant function for this special case and noted its geometric meaning. To better understand this geometry, we first considered the special case where the class density has the identity matrix as its covariance matrix. We observed that, in this case, the value of the discriminant function is constant along circles centered at the mean of the class density, and that the closer the feature vector is to the mean (in the usual Euclidean sense), the larger the value of the discriminant function $ g_i(x) $ for that class.
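For reference, here is one standard form of such a discriminant function (this is the usual textbook form for Bayes classification with normal class densities; the exact form and normalization used in lecture may differ):

$$ g_i(x) = \ln p(x|\omega_i) + \ln P(\omega_i) = -\frac{1}{2}(x-\mu_i)^T \Sigma_i^{-1} (x-\mu_i) - \frac{d}{2}\ln(2\pi) - \frac{1}{2}\ln|\Sigma_i| + \ln P(\omega_i), $$

where $ \mu_i $ and $ \Sigma_i $ are the mean and covariance matrix of class $ \omega_i $, and $ d $ is the dimension of the feature space. When $ \Sigma_i = I $ for every class, the terms that do not depend on the class can be dropped, leaving

$$ g_i(x) = -\frac{1}{2}\|x-\mu_i\|^2 + \ln P(\omega_i), $$

which is constant on circles (spheres, in higher dimensions) centered at $ \mu_i $ and increases as $ x $ moves closer to $ \mu_i $, in agreement with the geometric picture described above.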

We also spent a lot of time discussing the first homework. Hopefully you are all beginning to think about possible questions to investigate and how you will approach them.

Previous: Lecture 5

Next: Lecture 7


Comments

Please write your comments and questions below.


Back to ECE662 Spring 2012
