=Details of Lecture 8, [[ECE662]] Spring 2010=
In Lecture 8, we used the Central Limit Theorem to justify the commonly made assumption that the features are normally distributed. We then discussed the probability of error when using the Bayes decision rule. More precisely, we obtained the Chernoff bound and the Bhattacharyya bound on the probability of error.
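For reference, the standard two-class statement of these bounds (in generic notation, which may differ slightly from the lecture's) is

<math>
P(\text{error}) \le P(\omega_1)^{\beta}\, P(\omega_2)^{1-\beta} \int p(\mathbf{x}\mid\omega_1)^{\beta}\, p(\mathbf{x}\mid\omega_2)^{1-\beta}\, d\mathbf{x}, \qquad 0 \le \beta \le 1,
</math>

where minimizing the right-hand side over <math>\beta</math> gives the Chernoff bound, and fixing <math>\beta = 1/2</math> gives the looser but simpler Bhattacharyya bound.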
Notes for this lecture can be found [[noteslecture8ECE662S10|here]].
+ | |||
Previous: [[Lecture7ECE662S10|Lecture 7]]
Next: [[Lecture9ECE662S10|Lecture 9]]
− | |||
− | |||
− | |||
− | |||
− | |||
− | |||
− | |||
− | |||
− | |||
− | |||
− | |||
− | |||
− | |||
− | |||
− | |||
− | |||
− | |||
− | |||
− | |||
− | |||
− | |||
− | |||
− | |||
− | |||
− | |||
− | |||
− | |||
− | |||
− | |||
− | |||
− | |||
− | |||
− | |||
− | |||
− | |||
− | |||
− | |||
− | |||
− | |||
− | |||
− | |||
− | |||
− | |||
− | |||
− | |||
− | |||
− | |||
− | |||
− | |||
− | |||
− | |||
− | |||
− | |||
− | |||
− | |||
− | |||
− | |||
− | |||
− | |||
− | |||
− | |||
− | |||
− | |||
− | |||
− | |||
− | |||
− | |||
− | |||
− | |||
− | |||
− | |||
− | |||
− | |||
− | |||
----
− | |||
− | |||
[[ 2010 Spring ECE 662 mboutin|Back to 2010 Spring ECE 662 mboutin]]
− | |||
− |
Latest revision as of 08:09, 11 May 2010
Details of Lecture 8, ECE662 Spring 2010
In Lecture 8, we justified the commonly made assumption that the features are normally distributed with the Central Limit Theorem. We then discussed the probability of error when using Bayes decision rule. More precisely, we obtained the Chernoff Bound and the Bhattacharrya bound for the probability of error.
Note for this lecture can be found here.
Previous: Lecture 7
Next: Lecture 9