Details of Lecture 22, ECE662 Spring 2010
In Lecture 22, we continued our discussion of Fisher's linear discriminant. We began by answering the question: why not use
$ J(\vec{w})=\frac{\| \tilde{m}_1-\tilde{m}_2 \|^2}{\|\vec{w} \|^2} $ instead of $ J(\vec{w})=\frac{\| \tilde{m}_1-\tilde{m}_2 \|^2}{\tilde{s}_1^2+\tilde{s}_2^2} $?
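To make the comparison concrete, here is a minimal numerical sketch (the 2D data and the two candidate directions are hypothetical, not from the lecture): the mean-separation criterion ignores how much each class spreads along $ \vec{w} $, while Fisher's criterion penalizes that spread.

```python
import numpy as np

# Hypothetical 2D data: the class means differ along both axes, but the
# within-class spread is much larger along the second axis.
rng = np.random.default_rng(0)
X1 = rng.normal(loc=[0.0, 0.0], scale=[0.7, 3.0], size=(200, 2))
X2 = rng.normal(loc=[2.0, 2.0], scale=[0.7, 3.0], size=(200, 2))

def criteria(w, X1, X2):
    """Return (mean-separation criterion, Fisher criterion) for direction w."""
    y1, y2 = X1 @ w, X2 @ w                      # projected samples
    m1t, m2t = y1.mean(), y2.mean()              # projected class means
    s1sq = ((y1 - m1t) ** 2).sum()               # projected scatter, class 1
    s2sq = ((y2 - m2t) ** 2).sum()               # projected scatter, class 2
    J_mean = (m1t - m2t) ** 2 / (w @ w)          # |m1~ - m2~|^2 / ||w||^2
    J_fisher = (m1t - m2t) ** 2 / (s1sq + s2sq)  # Fisher's criterion
    return J_mean, J_fisher

# The mean-separation criterion is maximized along m1 - m2 (roughly (1, 1)
# here), even though the projected classes overlap badly in that direction;
# Fisher's criterion instead favors (1, 0), where the projected scatter is small.
print("w = (1, 1):", criteria(np.array([1.0, 1.0]), X1, X2))
print("w = (1, 0):", criteria(np.array([1.0, 0.0]), X1, X2))
```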
We then presented the analytic expression for $ \vec{w}_0 $, the argmax of $ J(\vec{w}) $, and related $ \vec{w}_0 $ to the least-squares solution of $ Y \vec{c}=b $. Finally, we began Section 9 of the course, on Support Vector Machines, by introducing the idea of extending the feature vector space into a space spanned by monomials.
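As a rough illustration of these points, here is a short sketch under the standard textbook construction (the samples are the same hypothetical data as above; the sign convention for class 2 and the particular choice of $ b $ are assumptions, not necessarily the exact ones used in lecture): the closed form $ \vec{w}_0 = S_W^{-1}(m_1 - m_2) $, its relation to the least-squares solution of $ Y\vec{c}=b $, and a small example of a monomial feature map.

```python
import numpy as np

# Same hypothetical 2D data as in the previous sketch.
rng = np.random.default_rng(0)
X1 = rng.normal(loc=[0.0, 0.0], scale=[0.7, 3.0], size=(200, 2))
X2 = rng.normal(loc=[2.0, 2.0], scale=[0.7, 3.0], size=(200, 2))
n1, n2 = len(X1), len(X2)

# Closed form for the maximizer of Fisher's criterion (standard result):
# w0 = Sw^{-1} (m1 - m2), where Sw is the within-class scatter matrix.
# Only the direction matters, since J(w) is invariant to scaling of w.
m1, m2 = X1.mean(axis=0), X2.mean(axis=0)
Sw = (X1 - m1).T @ (X1 - m1) + (X2 - m2).T @ (X2 - m2)
w0 = np.linalg.solve(Sw, m1 - m2)

# Least-squares connection (textbook construction, assumed here): build Y
# from augmented samples [1, x] with the class-2 rows negated, and set the
# targets b to n/n1 for class-1 rows and n/n2 for class-2 rows.  The weight
# part of the least-squares solution of Y c = b is then parallel to w0.
Y = np.vstack([ np.hstack([np.ones((n1, 1)), X1]),
               -np.hstack([np.ones((n2, 1)), X2])])
b = np.concatenate([np.full(n1, (n1 + n2) / n1),
                    np.full(n2, (n1 + n2) / n2)])
c, *_ = np.linalg.lstsq(Y, b, rcond=None)

print(w0 / np.linalg.norm(w0))        # Fisher direction
print(c[1:] / np.linalg.norm(c[1:]))  # least-squares direction (same up to sign/scale)

# Monomial feature expansion of the kind mentioned for SVMs: map a 2D point
# to all monomials of degree <= 2, so that a hyperplane in the expanded
# space corresponds to a quadratic decision boundary in the original space.
def monomial_features(x):
    x1v, x2v = x
    return np.array([1.0, x1v, x2v, x1v**2, x1v * x2v, x2v**2])
```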
Useful Links
For more info, you may look at these students' pages on Fisher's linear discriminant:

* Definition of Fisher's linear discriminant
* Fisher's linear discriminant in brief
Previous: Lecture 21
Next: Lecture 23