[[Category:ECE662Spring2012Boutin]]
[[Category:blog]]
[[Category:density estimation]]
[[Category:decision theory]]
[[Category:pattern recognition]]
[[Category:k nearest neighbors]]
=Lecture 19 Blog, [[ECE662]] Spring 2012, [[user:mboutin|Prof. Boutin]]=
Tuesday March 20, 2012 (Week 10)
----
Quick link to lecture blogs: [[Lecture1ECE662S12|1]]|[[Lecture2ECE662S12|2]]|[[Lecture3ECE662S12|3]]|[[Lecture4ECE662S12|4]]|[[Lecture5ECE662S12|5]]|[[Lecture6ECE662S12|6]]|[[Lecture7ECE662S12|7]]|[[Lecture8ECE662S12|8]]| [[Lecture9ECE662S12|9]]|[[Lecture10ECE662S12|10]]|[[Lecture11ECE662S12|11]]|[[Lecture12ECE662S12|12]]|[[Lecture13ECE662S12|13]]|[[Lecture14ECE662S12|14]]|[[Lecture15ECE662S12|15]]|[[Lecture16ECE662S12|16]]|[[Lecture17ECE662S12|17]]|[[Lecture18ECE662S12|18]]|[[Lecture19ECE662S12|19]]|[[Lecture20ECE662S12|20]]|[[Lecture21ECE662S12|21]]|[[Lecture22ECE662S12|22]]|[[Lecture23ECE662S12|23]]|[[Lecture24ECE662S12|24]]|[[Lecture25ECE662S12|25]]|[[Lecture26ECE662S12|26]]|[[Lecture27ECE662S12|27]]|[[Lecture28ECE662S12|28]]|[[Lecture29ECE662S12|29]]|[[Lecture30ECE662S12|30]]
----
Coming back from Spring break, today we covered the k-nearest neighbor (KNN) density estimation technique, along with the KNN pattern recognition method. More specifically, we presented a formula for estimating a density function at a point based on samples drawn from that density, and we showed that it is an unbiased estimate of the true value of the density at that point. We also showed how this formula is the basis for the "majority vote among neighbors" rule for assigning a category to data.
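The two ideas above can be sketched numerically. This is an illustrative sketch, not the exact notation from lecture: it assumes a 1-D setting and uses the (k-1)/(nV) form of the KNN density estimate, where V is the volume (here, interval length) of the smallest ball around the query point containing its k nearest samples.

```python
import numpy as np

def knn_density(x0, samples, k):
    """Sketch of a KNN density estimate at a point x0 (1-D case).

    Uses the (k-1)/(n*V) form, where V is the length of the smallest
    interval centered at x0 containing the k nearest samples.
    (The exact constant may differ from the lecture's derivation.)
    """
    n = len(samples)
    dists = np.sort(np.abs(samples - x0))
    radius = dists[k - 1]      # distance to the k-th nearest neighbor
    volume = 2.0 * radius      # 1-D "volume" of that neighborhood
    return (k - 1) / (n * volume)

def knn_classify(x0, samples, labels, k):
    """Majority vote among the k nearest neighbors of x0."""
    nearest = np.argsort(np.abs(samples - x0))[:k]
    values, counts = np.unique(labels[nearest], return_counts=True)
    return values[np.argmax(counts)]

# Density of 101 evenly spaced samples on [0, 1] should come out near 1.
samples = np.linspace(0.0, 1.0, 101)
est = knn_density(0.5, samples, k=11)

# Two well-separated clusters; the majority vote recovers the label.
pts = np.array([0.0, 1.0, 2.0, 10.0, 11.0, 12.0])
lbl = np.array([0, 0, 0, 1, 1, 1])
pred = knn_classify(1.5, pts, lbl, k=3)
```

Note that, unlike the Parzen-window approach, here the neighborhood volume V adapts to the data: it shrinks where samples are dense and grows where they are sparse.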
==Related Rhea Pages==
*[[KNN-K_Nearest_Neighbor_OldKiwi|A page about K-Nearest Neighbor from ECE662 Spring 2008]]
*[[KNN_algorithm_OldKiwi|A KNN tutorial, from ECE662 Spring 2008]]

Previous: [[Lecture18ECE662S12|Lecture 18]]

Next: [[Lecture20ECE662S12|Lecture 20]]
----
==Comments==
Please write your comments and questions below.
*Write a comment here
*Write another comment here.
----
[[2012_Spring_ECE_662_Boutin|Back to ECE662 Spring 2012]]

Latest revision as of 09:34, 22 March 2012