In statistics, the Lehmann–Scheffé theorem ties together the ideas of completeness, sufficiency, uniqueness, and best unbiased estimation. The theorem states that any estimator of a given unknown quantity that is unbiased and depends on the data only through a complete, sufficient statistic is the unique best unbiased estimator of that quantity. The theorem is named after Erich Leo Lehmann and Henry Scheffé, who established it in two early papers.
Formally, if T is a complete sufficient statistic for θ and E[g(T)] = τ(θ) for all θ, then g(T) is the unique minimum-variance unbiased estimator (MVUE) of τ(θ).
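The theorem can be illustrated with a small simulation in a standard textbook setting (this example and all names in it are illustrative, not from the article): for X₁, …, Xₙ i.i.d. Bernoulli(p), the statistic T = ΣXᵢ is complete and sufficient, and g(T) = T/n is unbiased for p, so by Lehmann–Scheffé the sample mean is the MVUE of p. Comparing its empirical variance with that of another unbiased estimator, such as the single observation X₁, shows the variance gap the theorem guarantees:

```python
import random

def simulate(p=0.3, n=20, reps=20000, seed=42):
    """Compare two unbiased estimators of a Bernoulli parameter p.

    mean_vals: g(T) = T/n, a function of the complete sufficient
               statistic T = sum of the observations (the MVUE).
    first_vals: X_1 alone, also unbiased but ignoring most of the data.
    """
    rng = random.Random(seed)
    mean_vals, first_vals = [], []
    for _ in range(reps):
        xs = [1 if rng.random() < p else 0 for _ in range(n)]
        mean_vals.append(sum(xs) / n)  # estimator based on T
        first_vals.append(xs[0])       # estimator based on X_1 only

    def var(v):
        m = sum(v) / len(v)
        return sum((x - m) ** 2 for x in v) / len(v)

    return var(mean_vals), var(first_vals)

v_mvue, v_naive = simulate()
# Theory: Var(T/n) = p(1-p)/n while Var(X_1) = p(1-p), so the
# MVUE's variance should be roughly n times smaller.
print(v_mvue, v_naive)
```

With p = 0.3 and n = 20 the theoretical variances are 0.0105 and 0.21, and the simulated values land close to these, consistent with g(T) being the minimum-variance choice among unbiased estimators.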
- Casella, George (2001). Statistical Inference. Duxbury Press. p. 369. ISBN 0-534-24312-6.
- Lehmann, E.L.; Scheffé, H. (1950). "Completeness, similar regions, and unbiased estimation. I.". Sankhyā 10 (4): 305–340. JSTOR 25048038. MR 39201.
- Lehmann, E.L.; Scheffé, H. (1955). "Completeness, similar regions, and unbiased estimation. II.". Sankhyā 15 (3): 219–236. JSTOR 25048243. MR 72410.