Details of Lecture 26, ECE662 Spring 2010
April 22, 2010
In Lecture 26, we finished our discussion of artificial neural networks. The "highlight" of the lecture was a somewhat involved (mostly because of the many indices) computation of the update rule for the gradient descent used to find the least-squares solution for the parameters of an ANN with an online training approach. We got to appreciate the fact that the parameters being optimized enter the cost function through linear combinations (the inputs to each activation), as well as the fact that 3 layers are typically sufficient for accurately approximating the k-class decision function. Otherwise, the computations would have been much worse.
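As a rough illustration of the ideas above, here is a minimal sketch of online (per-sample) gradient descent on a least-squares cost for a small feed-forward network with one hidden layer. The network size, activation function, learning rate, and toy data are illustrative assumptions, not the lecture's exact derivation; the point is that each weight enters through a linear combination, which keeps the chain-rule bookkeeping manageable.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Toy 2-class data (XOR), which needs a hidden layer to separate.
# (Hypothetical example data, not from the lecture.)
X = np.array([[0., 0.], [0., 1.], [1., 0.], [1., 1.]])
t = np.array([0., 1., 1., 0.])                   # target labels

n_hidden = 4                                     # assumed network size
W1 = rng.normal(scale=1.0, size=(2, n_hidden))   # input -> hidden weights
b1 = np.zeros(n_hidden)
W2 = rng.normal(scale=1.0, size=n_hidden)        # hidden -> output weights
b2 = 0.0
eta = 0.5                                        # assumed learning rate

def sse(W1, b1, W2, b2):
    """Sum of squared errors over the whole toy set."""
    y = sigmoid(sigmoid(X @ W1 + b1) @ W2 + b2)
    return float(np.sum((y - t) ** 2))

err_init = sse(W1, b1, W2, b2)

for epoch in range(5000):
    for x, tk in zip(X, t):
        # Forward pass: weights appear only inside linear combinations.
        h = sigmoid(x @ W1 + b1)        # hidden activations
        y = sigmoid(h @ W2 + b2)        # network output

        # Backward pass for the squared error (y - tk)^2 / 2,
        # using sigmoid'(z) = y (1 - y).
        delta_out = (y - tk) * y * (1 - y)        # output sensitivity
        delta_hid = delta_out * W2 * h * (1 - h)  # propagated to hidden layer

        # Online update: adjust weights after every single sample.
        W2 -= eta * delta_out * h
        b2 -= eta * delta_out
        W1 -= eta * np.outer(x, delta_hid)
        b1 -= eta * delta_hid

err_final = sse(W1, b1, W2, b2)
print(err_init, "->", err_final)   # error should shrink during training
```

The per-sample ("online") update contrasts with batch gradient descent, which would sum the gradients over all samples before moving the weights.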
Reminder: we are taking a poll regarding your favorite decision method. Please make sure to answer before the end of the semester. (Hint: stars stars!!)
It was announced that next Thursday's lecture (4-29-10) is canceled.
Note that there is a make-up class tomorrow, Friday (4-23-10), in EE117.
Previous: Lecture 25 Next: Lecture 27