
Comments on slecture: Convergence of the Maximum Likelihood (ML) Estimator over Multiple Trials

A slecture by CS student Spencer Carver

Partially based on the ECE662 Spring 2014 lecture material of Prof. Mireille Boutin.




This is the talk page for the slecture notes on Convergence of the Maximum Likelihood (ML) Estimator over Multiple Trials. Please leave me a comment below if you have any questions, or if you would like to discuss a topic.



Questions and Comments

A review by Jeehyun Choe.

This slecture is about the convergence of the Maximum Likelihood Estimator (MLE) over multiple trials. Spencer begins by stating the assumptions made for the experiments in this slecture. The notation for the samples, the parametric model, and the true value of the parameter is introduced. From this notation, the author constructs the joint density function, first as a product and then, by taking the natural logarithm, as a sum. Using this log-likelihood function, Spencer then computes the MLE. Three properties of the ML estimator are introduced: consistency, asymptotic normality, and efficiency. Next, the effectiveness of the MLE is evaluated through experiments with various sample sizes N and various numbers of trials. The experiments show, in a graph, that increasing the sample size increases the confidence of the MLE with a logarithmic relationship. It is also shown that, when the sample size is low, averaging the MLE values over several independent trials provides a more accurate estimate. Lastly, the Kullback-Leibler divergence ($ D_{KL} $) is introduced, and it is shown how $ D_{KL} $ can be computed from the MLE.
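
To make the steps in this summary concrete, below is a minimal numerical sketch (my own, not taken from the slecture) of how the convergence and trial-averaging behaviour could be reproduced, assuming a one-dimensional Gaussian model whose MLE has a closed form; the parameter values, sample sizes, and number of trials are arbitrary illustrative choices.

```python
import numpy as np

# Illustrative setup: i.i.d. samples from a 1-D Gaussian with assumed "true"
# parameters. These values, the sample sizes, and the trial count are my own
# choices for demonstration, not the settings used in the slecture.
rng = np.random.default_rng(0)
mu_true, var_true = 2.0, 1.5
sample_sizes = [10, 100, 1000, 10000]
n_trials = 50

def gaussian_mle(x):
    """Closed-form MLE for a 1-D Gaussian.

    Maximizes the log-likelihood
        l(mu, var) = -N/2 * log(2*pi*var) - sum((x - mu)^2) / (2*var),
    giving the sample mean and the 1/N (biased) sample variance.
    """
    mu_hat = x.mean()
    var_hat = ((x - mu_hat) ** 2).mean()
    return mu_hat, var_hat

def kl_gaussian(mu0, var0, mu1, var1):
    """D_KL( N(mu0, var0) || N(mu1, var1) ) between two 1-D Gaussians."""
    return 0.5 * (np.log(var1 / var0) + (var0 + (mu0 - mu1) ** 2) / var1 - 1.0)

for n in sample_sizes:
    mu_hats, kls = [], []
    for _ in range(n_trials):
        x = rng.normal(mu_true, np.sqrt(var_true), size=n)
        mu_hat, var_hat = gaussian_mle(x)
        mu_hats.append(mu_hat)
        kls.append(kl_gaussian(mu_true, var_true, mu_hat, var_hat))
    mu_hats = np.asarray(mu_hats)
    # The spread of the per-trial MLEs shrinks as N grows, and averaging the
    # MLEs over independent trials reduces the error at small N; D_KL measures
    # how far each fitted density is from the true one.
    print(f"N={n:5d}  std of mu_hat over trials: {mu_hats.std():.4f}  "
          f"trial-averaged mu_hat: {mu_hats.mean():.4f}  "
          f"mean D_KL: {np.mean(kls):.5f}")
```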

I have a comment on the Properties of ML Estimator section (4:03~6:02 in the video). It was good to have this kind of summary of the properties of the MLE, but just from watching the video it is not clear to me whether those properties hold only when the data follow a Gaussian distribution or whether they hold in general.
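
For reference, the versions of these properties that I have seen in textbooks are stated for general parametric models under standard regularity conditions, not only for Gaussian data; asymptotic normality, for example, is usually written as

$ \sqrt{N}\left(\hat{\theta}_{ML} - \theta_0\right) \xrightarrow{d} \mathcal{N}\left(0, I(\theta_0)^{-1}\right) $

where $ I(\theta_0) $ is the Fisher information at the true parameter value $ \theta_0 $. It would be helpful if the video stated explicitly which assumptions these properties rely on.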

Overall, I like the organization of this slecture for its step-by-step descriptions and its proper emphasis on the important points. The flow of the lecture is very smooth and easy to follow.



Back to ECE 662 S14 course wiki

Back to ECE 662 course page
