

Course Outline, ECE662 Spring 2010, Prof. Mimi

Note: This is an approximate outline that is subject to change throughout the semester.


{| class="wikitable"
|-
! Lecture(s)
! Topic
|-
| 1
| 1. Introduction
|-
| 1
| 2. What is Pattern Recognition?
|-
| [[Lecture2ECE662S10|2]],[[Lecture3ECE662S10|3]]
| 3. Finite vs. Infinite Feature Spaces
|-
| 4,5
| 4. Bayes Rule
|-
| 6-10
| 5. Discriminant Functions
* Definition
* Application to normally distributed features
* Error analysis
|-
| 11-13
| 6. Parametric Density Estimation
* Maximum likelihood estimation
* Bayesian parameter estimation
|-
| 13-19
| 7. Non-Parametric Density Estimation
* Parzen windows
* K-nearest neighbors
* The nearest neighbor classification rule
|-
| 19,20,[[Lecture21ECE662S10|21]],[[Lecture22ECE662S10|22]]
| 8. Linear Discriminants
|-
| 22,23,24,25
| 9. Non-Linear Discriminant Functions
* Support Vector Machines
* Artificial Neural Networks
|-
| 26,27,28,29,30
| 10. Clustering and Decision Trees
|}
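
For quick reference, topics 4 and 5 build on Bayes' rule in its standard classification form (the notation here is the usual textbook one, not necessarily the course's): the posterior probability of class <math>\omega_i</math> given a feature vector <math>x</math> is

<math>P(\omega_i \mid x) = \frac{p(x \mid \omega_i)\, P(\omega_i)}{\sum_j p(x \mid \omega_j)\, P(\omega_j)}</math>

and the Bayes decision rule assigns <math>x</math> to the class with the largest posterior. The discriminant functions of topic 5 are monotone transformations of this posterior, e.g. <math>g_i(x) = \ln p(x \mid \omega_i) + \ln P(\omega_i)</math>.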



[[2010 Spring ECE 662 mboutin|Back to 2010 Spring ECE 662 mboutin]]
