
Scoring algorithm


Scoring algorithm, also known as Fisher's scoring,[1] is a form of Newton's method used in statistics to solve maximum likelihood equations numerically, named after Ronald Fisher.

Sketch of derivation

Let $Y_1, \ldots, Y_n$ be random variables, independent and identically distributed with twice differentiable p.d.f. $f(y; \theta)$, and we wish to calculate the maximum likelihood estimator (M.L.E.) $\theta^*$ of $\theta$. First, suppose we have a starting point for our algorithm $\theta_0$, and consider a Taylor expansion of the score function, $V(\theta)$, about $\theta_0$:

$$V(\theta) \approx V(\theta_0) - \mathcal{J}(\theta_0)(\theta - \theta_0),$$

where

$$\mathcal{J}(\theta_0) = -\sum_{i=1}^n \left. \nabla \nabla^{\top} \log f(Y_i ; \theta) \right|_{\theta = \theta_0}$$

is the observed information matrix at $\theta_0$. Now, setting $\theta = \theta^*$, using that $V(\theta^*) = 0$ and rearranging gives us:

$$\theta^* \approx \theta_0 + \mathcal{J}^{-1}(\theta_0) V(\theta_0).$$

We therefore use the algorithm

$$\theta_{m+1} = \theta_m + \mathcal{J}^{-1}(\theta_m) V(\theta_m),$$

and under certain regularity conditions, it can be shown that $\theta_m \to \theta^*$.
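As a concrete illustration of the iteration above, the following Python sketch applies the update $\theta_{m+1} = \theta_m + \mathcal{J}^{-1}(\theta_m) V(\theta_m)$ to i.i.d. Exponential($\theta$) data with p.d.f. $f(y;\theta) = \theta e^{-\theta y}$, for which $V(\theta) = n/\theta - \sum_i Y_i$ and $\mathcal{J}(\theta) = n/\theta^2$. The model, simulated data, seed and starting value are assumptions made for this example, not part of the article; the iterate can be checked against the closed-form M.L.E. $n / \sum_i Y_i$.

    import numpy as np

    # Scoring iteration for i.i.d. Exponential(theta) data,
    # f(y; theta) = theta * exp(-theta * y)   (illustrative model choice).
    # Score:                V(theta) = n / theta - sum(y)
    # Observed information: J(theta) = n / theta**2

    rng = np.random.default_rng(0)
    y = rng.exponential(scale=1.0 / 2.5, size=500)   # simulated data, true theta = 2.5
    n = len(y)

    theta = 1.0                                      # starting point theta_0
    for m in range(25):
        score = n / theta - y.sum()                  # V(theta_m)
        obs_info = n / theta ** 2                    # J(theta_m)
        step = score / obs_info                      # J^{-1}(theta_m) V(theta_m)
        theta += step                                # theta_{m+1}
        if abs(step) < 1e-12:
            break

    print(theta, n / y.sum())   # iterate vs. closed-form M.L.E.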

Fisher scoring

In practice, $\mathcal{J}(\theta)$ is usually replaced by $\mathcal{I}(\theta) = \mathrm{E}[\mathcal{J}(\theta)]$, the Fisher information, thus giving us the Fisher Scoring Algorithm:

$$\theta_{m+1} = \theta_m + \mathcal{I}^{-1}(\theta_m) V(\theta_m).$$

Under some regularity conditions, if $\theta_m$ is a consistent estimator, then $\theta_{m+1}$ (the correction after a single step) is 'optimal' in the sense that its error distribution is asymptotically identical to that of the true max-likelihood estimate.[2]
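To illustrate why $\mathcal{I}(\theta)$ is often preferred to $\mathcal{J}(\theta)$, the sketch below (again an assumed example rather than anything from the article) runs Fisher scoring for the location $\theta$ of i.i.d. standard Cauchy observations: here the expected information is the constant $\mathcal{I}(\theta) = n/2$, whereas the observed information can be negative far from the mode, and the sample median supplies the consistent starting value required by the one-step optimality statement above.

    import numpy as np

    # Fisher scoring for the location theta of i.i.d. standard Cauchy data
    # (illustrative model choice).
    # Score:                V(theta) = sum( 2 r_i / (1 + r_i**2) ),  r_i = y_i - theta
    # Expected information: I(theta) = n / 2   (constant for this model)

    rng = np.random.default_rng(1)
    y = rng.standard_cauchy(1000) + 3.0      # simulated data, true location = 3.0
    n = len(y)

    theta = np.median(y)                     # consistent starting point
    for m in range(100):
        r = y - theta
        score = np.sum(2.0 * r / (1.0 + r ** 2))   # V(theta_m)
        fisher_info = n / 2.0                      # I(theta_m)
        step = score / fisher_info                 # I^{-1}(theta_m) V(theta_m)
        theta += step                              # theta_{m+1}
        if abs(step) < 1e-12:
            break

    print(theta)   # approximately the Cauchy location M.L.E.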

See also

References

  1. ^ Longford, Nicholas T. (1987). "A fast scoring algorithm for maximum likelihood estimation in unbalanced mixed models with nested random effects". Biometrika. 74 (4): 817–827. doi:10.1093/biomet/74.4.817.
  2. ^ Li, Bing; Babu, G. Jogesh (2019), "Bayesian Inference", Springer Texts in Statistics, New York, NY: Springer New York, Theorem 9.4, doi:10.1007/978-1-4939-9761-9_6, ISBN 978-1-4939-9759-6, S2CID 239322258, retrieved 2023-01-03

Further reading