[[Category:2010 Spring ECE 662 mboutin]]
<br>

= Course Outline, [[ECE662]] Spring 2010 [[User:Mboutin|Prof. Mimi]] =

Note: This is an approximate outline that is subject to change throughout the semester.

<br>

{| width="55%" cellspacing="1" cellpadding="1" border="1"
|-
! scope="col" | Lecture
! scope="col" | Topic
|-
| [[Lecture1ECE662S10|1]]
| 1. Introduction
|-
| [[Lecture1ECE662S10|1]]
| 2. What is Pattern Recognition?
|-
| [[Lecture2ECE662S10|2]], [[Lecture3ECE662S10|3]]
| 3. Finite vs. Infinite Feature Spaces
|-
| [[Lecture4ECE662S10|4]], [[Lecture5ECE662S10|5]]
| 4. Bayes Rule
|-
| [[Lecture6ECE662S10|6]]-10
|
5. Discriminant Functions

*Definition
*Application to normally distributed features
*Error analysis

|-
| [[Lecture11ECE662S10|11]], 12, 13
|
6. Parametric Density Estimation

*Maximum likelihood estimation
*Bayesian parameter estimation

|-
| 13-19
|
7. Non-parametric Density Estimation

*Parzen Windows
*K-nearest neighbors
*The nearest neighbor classification rule

|-
| 19, 20, [[Lecture21ECE662S10|21]], [[Lecture22ECE662S10|22]]
| 8. Linear Discriminants
|-
| [[Lecture22ECE662S10|22]], [[Lecture23ECE662S10|23]], [[Lecture24ECE662S10|24]], [[Lecture25ECE662S10|25]], [[Lecture26ECE662S10|26]]
|
9. Non-linear Discriminant Functions

*Support Vector Machines
*Artificial Neural Networks

|-
| 27, 28, 29, 30
| 10. Clustering and Decision Trees
|}
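
As a quick illustration of topics 4 and 5 above (Bayes rule and discriminant functions), here is a minimal Python sketch of the two-class Bayes decision rule, assuming one-dimensional Gaussian class-conditional densities; the class names, means, standard deviations, and priors below are made-up example values, not course data.

<pre>
# Minimal sketch of the Bayes decision rule via discriminant functions,
# assuming 1-D Gaussian class-conditional densities (illustrative values only).
import math

CLASS_PARAMS = {          # hypothetical p(x | w_i) = N(mean, std^2)
    "w1": (0.0, 1.0),     # (mean, standard deviation) for class w1
    "w2": (2.0, 1.0),     # (mean, standard deviation) for class w2
}
PRIORS = {"w1": 0.5, "w2": 0.5}   # prior probabilities P(w_i)

def log_gaussian(x, mean, std):
    """Log of the 1-D Gaussian density N(mean, std^2) evaluated at x."""
    return -0.5 * math.log(2 * math.pi * std ** 2) - (x - mean) ** 2 / (2 * std ** 2)

def discriminant(x, label):
    """Discriminant g_i(x) = ln p(x | w_i) + ln P(w_i)."""
    mean, std = CLASS_PARAMS[label]
    return log_gaussian(x, mean, std) + math.log(PRIORS[label])

def classify(x):
    """Assign x to the class whose discriminant is largest (Bayes rule)."""
    return max(CLASS_PARAMS, key=lambda label: discriminant(x, label))

for x in (-1.0, 0.5, 3.0):
    print(x, "->", classify(x))
</pre>

With equal priors and equal variances, comparing the discriminants reduces to assigning x to the nearest class mean, so the decision boundary falls halfway between the two means.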

<br>

----

[[2010 Spring ECE 662 mboutin|Back to 2010 Spring ECE 662 mboutin]]