Minimum-variance unbiased estimator: Difference between revisions

From Wikipedia, the free encyclopedia

Revision as of 00:00, 10 March 2005

A minimum variance unbiased estimator (MVUE or MVU estimator) is an estimator of parameters studied in estimation theory. The MVUE is defined only among unbiased estimators: it is the unbiased estimator whose variance is minimized for all values of the parameters. If an estimator's variance fails to be minimal for even one value of the parameters, then it is not the MVUE.
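The definition above can be illustrated numerically (a minimal simulation sketch, not part of the article; the choice of distribution and estimators is an assumption for illustration). For i.i.d. normal data, both the sample mean and the sample median are unbiased for the population mean, but the sample mean has the smaller variance; it is in fact the MVUE in this model:

```python
import random
import statistics

random.seed(0)

def estimator_variance(estimator, n=25, trials=20000, mu=0.0, sigma=1.0):
    """Monte Carlo estimate of the variance of `estimator`
    over repeated samples of size n from Normal(mu, sigma)."""
    estimates = []
    for _ in range(trials):
        sample = [random.gauss(mu, sigma) for _ in range(n)]
        estimates.append(estimator(sample))
    return statistics.pvariance(estimates)

# Both estimators are unbiased for the mean of a normal distribution,
# but only the sample mean attains the minimal variance (sigma^2 / n).
var_mean = estimator_variance(statistics.fmean)
var_median = estimator_variance(statistics.median)

print(var_mean, var_median)  # the mean's variance is the smaller of the two
```

The simulated variance of the sample mean is close to <math>\sigma^2/n</math>, while the median's variance is larger by a factor of roughly <math>\pi/2</math> for normal data.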

Constraining the bias to zero and then minimizing the mean squared error (MSE) is equivalent to finding the MVUE, since for an unbiased estimator the MSE equals the variance, so minimizing the MSE over unbiased estimators is the same as minimizing the variance.
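The equivalence rests on the standard bias–variance decomposition of the mean squared error (a standard identity, added here for completeness):

:<math>\operatorname{MSE}(\hat\theta) = \operatorname{E}\!\left[(\hat\theta - \theta)^2\right] = \operatorname{Var}(\hat\theta) + \left(\operatorname{Bias}(\hat\theta)\right)^2,</math>

so when the bias term is constrained to zero, minimizing the MSE reduces to minimizing the variance.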