Revision as of 08:39, 9 March 2010


Course Outline, ECE662 Spring 2010, Prof. Mimi

Note: This is an approximate outline that is subject to change throughout the semester.


{| width="55%" cellspacing="1" cellpadding="1" border="1"
|-
! scope="col" | Lecture
! scope="col" | Topic
|-
| 1
| 1. Introduction
|-
| 1
| 2. What is Pattern Recognition
|-
| 2-3
| 3. Finite vs. Infinite Feature Spaces
|-
| 4-5
| 4. Bayes Rule
|-
| 6-10
|
5. Discriminant Functions
*Definition
*Application to normally distributed features
*Error analysis
|-
| 11-12
|
6. Parametric Density Estimation
*Maximum likelihood estimation
*Bayesian parameter estimation
|-
| 
|
7. Non-parametric Density Estimation
*Parzen windows
*K-nearest neighbors
*The nearest-neighbor classification rule
|-
| 
| 8. Linear Discriminants
|-
| 
|
9. More on Non-Linear Discriminant Functions
*Support Vector Machines
*Artificial Neural Networks
*Decision Trees
|-
| 
| 10. Clustering
|}
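As a quick illustration of topic 4 in the outline, Bayes' rule turns class priors and class-conditional likelihoods into posteriors; the minimum-error classifier then picks the class with the largest posterior. This is a minimal sketch in Python (the function name and interface are my own, not taken from the course material):

```python
import numpy as np

def bayes_posteriors(priors, likelihoods):
    """Posteriors P(class_i | x) from priors P(class_i) and
    class-conditional likelihoods p(x | class_i), via Bayes' rule."""
    joint = np.asarray(priors, dtype=float) * np.asarray(likelihoods, dtype=float)
    return joint / joint.sum()  # normalize by the evidence p(x)

# With equal priors, the posterior is driven entirely by the likelihoods:
p = bayes_posteriors([0.5, 0.5], [0.2, 0.8])
decision = p.argmax()  # minimum-error rule: pick the largest posterior
```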

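For topic 6, maximum likelihood estimation has a closed form in the Gaussian case: the sample mean and the (biased) sample variance. A one-dimensional sketch, assuming a Gaussian model (names are illustrative):

```python
import numpy as np

def gaussian_mle(x):
    """ML estimates (mu, sigma^2) for a 1-D Gaussian.
    Note the variance estimate divides by n, not n-1 (the ML estimate
    is biased), which is a standard point of discussion in this topic."""
    x = np.asarray(x, dtype=float)
    mu = x.mean()
    sigma2 = ((x - mu) ** 2).mean()
    return mu, sigma2
```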

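For topic 7, a Parzen-window estimate replaces the parametric model with a sum of kernels centered on the samples. A 1-D sketch with a Gaussian kernel of width h (again, illustrative only):

```python
import numpy as np

def parzen_density(samples, x, h=1.0):
    """Gaussian-kernel Parzen-window estimate of the density p(x),
    averaging one kernel per training sample."""
    samples = np.asarray(samples, dtype=float)
    u = (x - samples) / h
    kernels = np.exp(-0.5 * u**2) / np.sqrt(2.0 * np.pi)
    return kernels.mean() / h
```

The bandwidth h controls the bias-variance trade-off: small h gives a spiky estimate, large h oversmooths.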

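Also under topic 7, the k-nearest-neighbor classification rule labels a query point by a majority vote among its k closest training samples. A minimal sketch using Euclidean distance (function and variable names are my own):

```python
import numpy as np

def knn_predict(X_train, y_train, x, k=3):
    """Classify point x by majority vote of its k nearest neighbors
    (k=1 gives the nearest-neighbor rule from the outline)."""
    d = np.linalg.norm(np.asarray(X_train, dtype=float) - x, axis=1)
    nearest_labels = np.asarray(y_train)[np.argsort(d)[:k]]
    return np.bincount(nearest_labels).argmax()
```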
Back to 2010 Spring ECE 662 mboutin
