
Latest revision as of 07:55, 22 April 2010


Course Outline, ECE662 Spring 2010, Prof. Mimi

Note: This is an approximate outline that is subject to change throughout the semester.


{| width="55%" cellspacing="1" cellpadding="1" border="1"
|-
! scope="col" | Lecture
! scope="col" | Topic
|-
| [[Lecture1ECE662S10|1]]
| 1. Introduction
|-
| [[Lecture1ECE662S10|1]]
| 2. What is Pattern Recognition?
|-
| [[Lecture2ECE662S10|2]], [[Lecture3ECE662S10|3]]
| 3. Finite vs. Infinite Feature Spaces
|-
| [[Lecture4ECE662S10|4]], [[Lecture5ECE662S10|5]]
| 4. Bayes Rule
|-
| [[Lecture6ECE662S10|6]]-10
|
5. Discriminant Functions
*Definition;
*Application to normally distributed features;
*Error analysis.
|-
| [[Lecture11ECE662S10|11]], 12, 13
|
6. Parametric Density Estimation
*Maximum likelihood estimation
*Bayesian parameter estimation
|-
| 13-19
|
7. Non-parametric Density Estimation
*Parzen windows
*K-nearest neighbors
*The nearest neighbor classification rule
|-
| 19, 20, [[Lecture21ECE662S10|21]], [[Lecture22ECE662S10|22]]
| 8. Linear Discriminants
|-
| [[Lecture22ECE662S10|22]], [[Lecture23ECE662S10|23]], [[Lecture24ECE662S10|24]], [[Lecture25ECE662S10|25]], [[Lecture26ECE662S10|26]]
|
9. Non-linear Discriminant Functions
*Support Vector Machines
*Artificial Neural Networks
|-
| 27, 28, 29, 30
| 10. Clustering and Decision Trees
|}
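As a quick taste of topic 7 above, here is a minimal sketch of the nearest neighbor classification rule: a query point is assigned the label of its closest training point under Euclidean distance. This is an illustrative example only, not part of the course materials; the function name and toy data are made up for this sketch.

```python
import math


def nearest_neighbor_classify(train, query):
    """Assign `query` the label of its closest training point.

    `train` is a list of (feature_vector, label) pairs.
    """
    def dist(a, b):
        # Euclidean distance between two feature vectors
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

    # Pick the training pair whose feature vector is closest to the query
    _, label = min(train, key=lambda pair: dist(pair[0], query))
    return label


# Two classes of points in the plane
train = [((0.0, 0.0), "A"), ((0.1, 0.2), "A"),
         ((1.0, 1.0), "B"), ((0.9, 1.1), "B")]
print(nearest_neighbor_classify(train, (0.2, 0.1)))  # A
print(nearest_neighbor_classify(train, (0.8, 0.9)))  # B
```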



Back to [[2010 Spring ECE 662 mboutin]]
