The formal definition of Fisher information is:
"For a random variable $ X $ with likelihood function $ L(\theta; X) $, the score function (with respect to the parameter $ \theta $) is $ s(\theta; X) = \nabla_\theta \ln L(\theta; X) $" (Rothman).
In plainer terms, this means that for a given probability density function, the amount of information the data carry about the parameter $ \theta $, the Fisher information $ I(\theta) $, is equal to the variance of the score: $ I(\theta) = \operatorname{Var}_\theta\!\big(s(\theta; X)\big) $.
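To make this concrete, consider a single observation $ X \sim \mathrm{Bernoulli}(\theta) $ (a standard textbook example, not from the quoted source). The log-likelihood and score are

$ \ln L(\theta; X) = X \ln\theta + (1 - X)\ln(1 - \theta), \qquad s(\theta; X) = \frac{X}{\theta} - \frac{1 - X}{1 - \theta}. $

The score has mean zero, since $ E_\theta[X] = \theta $, so its variance is

$ I(\theta) = \operatorname{Var}_\theta\!\big(s(\theta; X)\big) = E_\theta\!\big[s(\theta; X)^2\big] = \theta \cdot \frac{1}{\theta^{2}} + (1 - \theta) \cdot \frac{1}{(1 - \theta)^{2}} = \frac{1}{\theta(1 - \theta)}, $

the familiar Fisher information of a Bernoulli trial: the closer $ \theta $ is to 0 or 1, the more information a single observation provides about it.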