
In statistics, overfitting means that some of the relationships that appear statistically significant are actually just noise. An overfitted model has many more degrees of freedom than the data can support. Consequently, such a model does not replicate well and does a poor job of predicting future responses.
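
As a rough illustration (not part of the original text), here is a small NumPy sketch with invented data and settings: a polynomial with about as many free parameters as there are observations can reproduce the sample almost exactly, yet that fit is mostly noise and it typically predicts fresh responses from the same process far worse than a simple model.

 import numpy as np
 
 rng = np.random.default_rng(0)
 
 # The data at hand: 12 observations of a relation that is really just y = x plus noise.
 x_train = np.linspace(0, 1, 12)
 y_train = x_train + 0.1 * rng.standard_normal(12)
 
 # Future responses from the same process.
 x_new = rng.uniform(0, 1, 200)
 y_new = x_new + 0.1 * rng.standard_normal(200)
 
 for degree in (1, 11):                  # 2 parameters vs. 12 parameters for 12 points
     coeffs = np.polyfit(x_train, y_train, degree)
     fit_mse     = np.mean((np.polyval(coeffs, x_train) - y_train) ** 2)
     predict_mse = np.mean((np.polyval(coeffs, x_new)   - y_new)   ** 2)
     print(f"degree {degree:>2}: in-sample MSE {fit_mse:.4f}, prediction MSE {predict_mse:.4f}")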

In machine learning, overfitting occurs when training runs for too long or the training set is too small or unrepresentative. As a result, the learner may adjust to very specific random features of the training data that have no causal relation to the target function. This can lead to a situation in which the training error keeps decreasing while the test error increases.
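
A minimal sketch of that last point (again not from the original text, using NumPy only and made-up data): an over-parameterised model trained by plain gradient descent keeps driving the training error down, while the held-out error typically bottoms out and then climbs once the model starts fitting the noise.

 import numpy as np
 
 rng = np.random.default_rng(1)
 
 def poly_features(x, degree):
     # Monomial features x^1 .. x^degree.
     return np.column_stack([x ** d for d in range(1, degree + 1)])
 
 degree = 12
 x_train = rng.uniform(-1, 1, 15)
 y_train = x_train + 0.3 * rng.standard_normal(15)     # true relation is linear plus noise
 x_test  = rng.uniform(-1, 1, 200)
 y_test  = x_test + 0.3 * rng.standard_normal(200)
 
 # Standardise features using the training statistics.
 X_raw = poly_features(x_train, degree)
 mu, sigma = X_raw.mean(axis=0), X_raw.std(axis=0)
 X_train = (X_raw - mu) / sigma
 X_test  = (poly_features(x_test, degree) - mu) / sigma
 
 w = np.zeros(degree)
 # Step size below the stability limit, so the training loss decreases monotonically.
 lr = 1.0 / np.linalg.eigvalsh(X_train.T @ X_train / len(y_train)).max()
 
 for step in range(1, 20001):
     grad = X_train.T @ (X_train @ w - y_train) / len(y_train)
     w -= lr * grad
     if step in (10, 100, 1000, 20000):                # a few checkpoints during training
         train_mse = np.mean((X_train @ w - y_train) ** 2)
         test_mse  = np.mean((X_test  @ w - y_test)  ** 2)
         print(f"step {step:>6}: train MSE {train_mse:.4f}, test MSE {test_mse:.4f}")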
