Lecture 18 Blog, ECE662 Spring 2012, Prof. Boutin
Thursday March 8, 2012 (Week 9)
Quick link to lecture blogs: 1|2|3|4|5|6|7|8|9|10|11|12|13|14|15|16|17|18|19|20|21|22|23|24|25|26|27|28|29|30
Today we finished discussing the Parzen window method for estimating the probability density function at a point x of the feature space using samples. In particular, we discussed how, in the context of a decision problem, this technique boils down to a majority vote among the neighboring samples. However, it was pointed out that using different window volumes for different classes might improve the result of this voting procedure. An interesting connection to the sampling theorem was also pointed out by a student.
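For illustration, here is a minimal Python sketch of the majority-vote interpretation discussed above, assuming a uniform hypercube window of the same side length h for both classes; the function name, the toy Gaussian data, and the choice of h are assumptions made for this example, not part of the lecture.
<pre>
import numpy as np

def parzen_window_classify(x, samples, labels, h=2.0):
    """Assign x to the class with the most training samples inside a
    hypercube window of side length h centered at x -- the majority-vote
    reading of the Parzen window decision rule (illustrative sketch)."""
    x = np.asarray(x, dtype=float)
    samples = np.asarray(samples, dtype=float)
    labels = np.asarray(labels)
    # A sample lies inside the hypercube if every coordinate differs
    # from x by at most h/2.
    inside = np.all(np.abs(samples - x) <= h / 2.0, axis=1)
    votes = labels[inside]
    if votes.size == 0:
        return None  # empty window: all class density estimates are zero here
    classes, counts = np.unique(votes, return_counts=True)
    return classes[np.argmax(counts)]

# Toy example: two 2-D Gaussian classes (data and parameters are made up).
rng = np.random.default_rng(0)
samples = np.vstack([rng.normal([0, 0], 1.0, (100, 2)),
                     rng.normal([3, 3], 1.0, (100, 2))])
labels = np.array([0] * 100 + [1] * 100)
print(parzen_window_classify([0.5, 0.5], samples, labels))  # expected: 0
print(parzen_window_classify([2.5, 3.0], samples, labels))  # expected: 1
</pre>
Allowing a different h for each class in this sketch would correspond to the per-class window volumes mentioned above.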
Note that the second homework assignment is now posted. See you after Spring break!
Related Rhea Pages
*[[Hw2_ECE662_S12|HW2 ECE662 Spring 2012]]
*[[ECE662_hw2_discussions|HW2 discussion from ECE662 Spring 2010]]
*[[Lecture_16_-_Parzen_Window_Method_and_K-nearest_Neighbor_Density_Estimate_Old_Kiwi|Lecture 16, ECE662, Spring 2008]]
Previous: Lecture 17
Next: Lecture 19
Comments
Please write your comments and questions below.
- Write a comment here
- Write another comment here.