Lecture 20 Blog, ECE662 Spring 2012, Prof. Boutin
Wednesday March 22, 2012 (Week 10)
Quick link to lecture blogs: 1|2|3|4|5|6|7|8|9|10|11|12|13|14|15|16|17|18|19|20|21|22|23|24|25|26|27|28|29|30
Today we talked about the nearest neighbor decision rule. We pointed out that this rule can be obtained as a special case of a (biased) formula for estimating the mixture density at a point x using the nearest neighbor among a set of labeled samples drawn from that mixture density.
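As a concrete illustration (not from the lecture notes themselves), here is a minimal Python/NumPy sketch of that connection, with all function and variable names being illustrative: with k = 1, the k-nearest-neighbor density estimate k/(n·V(x)) reduces to 1/(n·V(x)), where V(x) is the volume of the smallest ball centered at x containing one of the n samples, and the corresponding decision rule simply copies the label of that nearest sample.

```python
import numpy as np
from math import gamma, pi

def nn_density_estimate(x, samples):
    """Estimate the mixture density at x from the single nearest neighbor.

    With k = 1 the k-NN density estimate reduces to
    rho_hat(x) = 1 / (n * V(x)), where V(x) is the volume of the smallest
    ball centered at x that contains one sample. As noted in the lecture,
    this estimate is biased. (Names here are illustrative.)
    """
    n, d = samples.shape
    r = np.min(np.linalg.norm(samples - x, axis=1))  # distance to nearest sample
    # Volume of a d-dimensional ball of radius r; diverges if x coincides with a sample (r = 0)
    volume = (pi ** (d / 2) / gamma(d / 2 + 1)) * r ** d
    return 1.0 / (n * volume)

def nn_classify(x, samples, labels):
    """Nearest neighbor decision rule: give x the label of its closest sample."""
    i = np.argmin(np.linalg.norm(samples - x, axis=1))
    return labels[i]

# Example: three labeled samples in the plane
X = np.array([[0.0, 0.0], [1.0, 1.0], [2.0, 0.5]])
y = np.array([0, 1, 1])
print(nn_classify(np.array([0.9, 0.8]), X, y))  # prints 1
```

Note that classification does not require the density values themselves: comparing the per-class estimates at x amounts to asking which class contributes the nearest sample, which is why the decision rule falls out of the density-estimation formula.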
Related Rhea Pages
- Lecture 17 from ECE662 Spring 2008
- Lecture 18 from ECE662 Spring 2008
- Lecture 19 from ECE662 Spring 2008
- discussion: is KNN the same as NN when k=1?
Previous: Lecture 19
Next: Lecture 21
Comments
Please write your comments and questions below.
- Write a comment here.
- Write another comment here.