
Jeffreys Prior

What is the Jeffreys Prior?

The Jeffreys prior, a concept from Bayesian statistics, is an objective prior distribution on a parameter space. A prior distribution is the probability distribution that expresses one's beliefs about the parameters before any evidence is taken into account. The density of the Jeffreys prior is proportional to the square root of the determinant of the Fisher information matrix. This can be written as: $ p(\vec{\theta}) \propto \sqrt{\det I(\vec{\theta})} $
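For example, for a single Bernoulli observation with success probability $ \theta $, the Fisher information is $ I(\theta) = \frac{1}{\theta(1-\theta)} $, so the Jeffreys prior is $ p(\theta) \propto \theta^{-1/2}(1-\theta)^{-1/2} $, which is the Beta(1/2, 1/2) distribution.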

One feature of this prior is that it is invariant under a change of coordinates of the parameter vector $ \vec{\theta} $. This means that the relative probability assigned to a volume of the parameter space does not depend on the parametrization used to derive the Jeffreys prior. The use of the Jeffreys prior nevertheless violates the likelihood principle, which is the proposition that, given a model, all the evidence in a sample relevant to the model parameters is contained in the likelihood function.
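The Bernoulli example above shows this invariance concretely. Reparametrizing with $ \theta = \sin^2\phi $ gives $ I(\phi) = I(\theta)\left(\frac{d\theta}{d\phi}\right)^2 = \frac{(2\sin\phi\cos\phi)^2}{\sin^2\phi\,\cos^2\phi} = 4 $, so the Jeffreys prior on $ \phi $ is uniform. Transforming the Beta(1/2, 1/2) density for $ \theta $ through the same change of variables, $ p(\phi) = p(\theta)\left|\frac{d\theta}{d\phi}\right| \propto \frac{2\sin\phi\cos\phi}{\sin\phi\cos\phi} = 2 $, gives that same uniform prior, so the two routes agree.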

When using the Jeffreys prior, inferences about $ \vec{\theta} $ depend not only on the probability of the observed data as a function of $ \vec{\theta} $ but also on all possible experimental outcomes, because the Fisher information is computed as an expectation over the entire sample space of the experiment.
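The following is a minimal numerical sketch of this dependence, assuming NumPy and SciPy are available and using arbitrary illustrative values for $ \theta $, the number of trials, and the number of successes. It computes the Fisher information as an expectation of the squared score over the sample spaces of two different experiments: a binomial experiment with a fixed number of trials, and a negative binomial experiment that stops after a fixed number of successes.

<pre>
# Fisher information as an expectation over the sample space (illustrative sketch).
import numpy as np
from scipy.stats import binom, nbinom

theta = 0.3   # illustrative value of the success probability
n = 10        # fixed number of trials (binomial experiment)
r = 5         # fixed number of successes (negative binomial experiment)

# Binomial experiment: x successes in n trials.
# Score = d/d(theta) log L = x/theta - (n - x)/(1 - theta).
x = np.arange(n + 1)
score = x / theta - (n - x) / (1 - theta)
info_binom = np.sum(binom.pmf(x, n, theta) * score**2)
print(info_binom, n / (theta * (1 - theta)))        # closed form: n / (theta (1 - theta))

# Negative binomial experiment: k failures before the r-th success.
# Score = r/theta - k/(1 - theta).
k = np.arange(0, 2000)                              # truncate the infinite sum
score = r / theta - k / (1 - theta)
info_nbinom = np.sum(nbinom.pmf(k, r, theta) * score**2)
print(info_nbinom, r / (theta**2 * (1 - theta)))    # closed form: r / (theta^2 (1 - theta))

# sqrt(info_binom) and sqrt(info_nbinom) are different functions of theta,
# so the two experiments lead to different Jeffreys priors.
</pre>

The binomial experiment gives the Beta(1/2, 1/2) Jeffreys prior from the earlier example, while the negative binomial experiment gives $ p(\theta) \propto \theta^{-1}(1-\theta)^{-1/2} $, so the prior, and hence the inference, reflects the stopping rule of the experiment and not just the observed likelihood.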


Back To Fisher information
