<center><font size="4">Comments for Introduction to Maximum Likelihood Estimation</font>

A [https://www.projectrhea.org/learning/slectures.php slecture] by ECE student Wen Yi</center>

[https://kiwi.ecn.purdue.edu/rhea/index.php/MLEforGMM Back to Introduction to Maximum Likelihood Estimation]  
 
This is the talk page for the slecture notes on Introduction to Maximum Likelihood Estimation. Please leave me a comment below if you have any questions, if you notice any errors, or if you would like to discuss a topic further.

----

[Review by Jihwan Lee] This slecture discusses Maximum Likelihood Estimation (MLE), a widely used technique for density estimation. The author first introduces the basic principle of MLE nicely and then provides some practical considerations: why we take the log-likelihood, why constant terms can be ignored, and what numerical issues might arise in MLE. All of this is written well, so readers can understand and follow the slecture easily. The author also shows how parameters can be estimated with the MLE approach for some example distributions, which will help readers better understand how MLE works. Lastly, MLE for the Gaussian Mixture Model is presented, and the EM (Expectation-Maximization) algorithm is introduced for the case of a model that contains unobserved latent variables.
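
To make the log-likelihood point concrete, here is a minimal sketch of MLE for a univariate Gaussian; the data, seed, and function names are illustrative choices of mine, not taken from the slecture:

<pre>
import numpy as np

# Illustrative MLE for a univariate Gaussian N(mu, var).
# Working with the log-likelihood turns a product of densities into a
# sum, which avoids numerical underflow, and additive constants such
# as -n/2 * log(2*pi) can be dropped because they do not change the
# location of the maximum.

rng = np.random.default_rng(0)                 # hypothetical synthetic data
x = rng.normal(loc=2.0, scale=1.5, size=1000)

def log_likelihood(mu, var, x):
    """Gaussian log-likelihood with the 2*pi constant dropped."""
    n = len(x)
    return -0.5 * n * np.log(var) - np.sum((x - mu) ** 2) / (2.0 * var)

# Setting the gradient of the log-likelihood to zero gives the
# closed-form estimates: the sample mean and the sample variance.
mu_hat = x.mean()
var_hat = ((x - mu_hat) ** 2).mean()

print("mu_hat  =", mu_hat)                     # close to 2.0
print("var_hat =", var_hat)                    # close to 1.5**2 = 2.25
print("log-lik =", log_likelihood(mu_hat, var_hat, x))
</pre>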

* Good points
** Overall, this slecture is well written and helps a reader better understand how MLE works by showing examples.

* Suggestions
** There are some typos :)
** In section 4, for each distribution, explicitly stating which parameters are to be estimated would make the derivations much clearer to readers who lack the background.
** In the last section, it would be better to explain in more detail when the EM algorithm should be used. Readers without a background in this area might be confused by the concept of latent variables, and what exactly should be done at each step of the EM algorithm could also seem abstract to them. This slecture would be much better if it provided a small GMM example (two Gaussians would be enough) and showed how each step works for the first few iterations, as in the sketch below.
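
A minimal sketch of the kind of two-Gaussian example suggested above; all data and starting values here are hypothetical, chosen only to show the E-step and M-step updates for the first few iterations:

<pre>
import numpy as np

rng = np.random.default_rng(1)
# Hypothetical data from two Gaussians; which component generated each
# point is the unobserved latent variable.
x = np.concatenate([rng.normal(-2.0, 1.0, 300), rng.normal(3.0, 1.0, 700)])

def gauss_pdf(x, mu, var):
    return np.exp(-(x - mu) ** 2 / (2.0 * var)) / np.sqrt(2.0 * np.pi * var)

# Illustrative starting guesses for the weights, means, and variances.
w   = np.array([0.5, 0.5])
mu  = np.array([-1.0, 1.0])
var = np.array([1.0, 1.0])

for it in range(3):  # first few iterations only
    # E-step: responsibility of each component for each point, i.e. the
    # posterior probability of the latent variable given current parameters.
    dens = np.stack([wk * gauss_pdf(x, mk, vk)
                     for wk, mk, vk in zip(w, mu, var)])
    resp = dens / dens.sum(axis=0)

    # M-step: responsibility-weighted MLE updates of the parameters.
    nk  = resp.sum(axis=1)
    w   = nk / len(x)
    mu  = (resp * x).sum(axis=1) / nk
    var = (resp * (x - mu[:, None]) ** 2).sum(axis=1) / nk

    print("iter", it + 1, "w:", w.round(3), "mu:", mu.round(3),
          "var:", var.round(3))
</pre>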
  
 
== Questions and Comments:  ==
 

* Additional Questions / Comments

Back to ECE 662 S14 course wiki

Back to ECE 662 course page
