
The likelihood principle is a principle of statistical inference which asserts that all of the information in a sample is contained in the likelihood function. A likelihood function arises from a conditional probability distribution considered as a function of its second argument, with the first argument held fixed. For example, consider a model that gives the probability density function of an observable random variable X as a function of a parameter θ. Then, for a specific observed value x of X, the function L(θ | x) = P(X = x | θ) is the likelihood function of θ. It gives a measure of how plausible any particular value of θ is, given that X has been observed to equal x. Two likelihood functions are equivalent if one is a scalar multiple of the other.
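
As a concrete illustration (a sketch, not part of the original article), assume a binomial model: x successes are observed in n independent trials with unknown success probability θ. The snippet below evaluates L(θ | x) = P(X = x | θ) on a grid of θ values and checks that multiplying the likelihood by a constant does not change which θ maximizes it, which is the sense in which scalar multiples of a likelihood function are equivalent. The data values are hypothetical.

<pre>
import numpy as np
from scipy.stats import binom

# Hypothetical data: x = 7 successes observed in n = 10 trials.
n, x = 10, 7

# Likelihood of theta given the observed x: L(theta | x) = P(X = x | theta).
theta_grid = np.linspace(0.001, 0.999, 999)
likelihood = binom.pmf(x, n, theta_grid)

# A scalar multiple of the likelihood (e.g. dropping a constant factor)
# carries the same information about theta.
scaled = 5.0 * likelihood

# Both versions are maximized at the same theta (here x / n = 0.7),
# so they support the same inference about theta.
print(theta_grid[np.argmax(likelihood)])  # ~0.7
print(theta_grid[np.argmax(scaled)])      # ~0.7
</pre>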
