Talk:Minimum-variance unbiased estimator

From Wikipedia, the free encyclopedia
WikiProject Statistics (Rated C-class, Top-importance)

This article is within the scope of the WikiProject Statistics, a collaborative effort to improve the coverage of statistics on Wikipedia. If you would like to participate, please visit the project page or join the discussion.


Michael Hardy wrote:

Deleting nonsense. I already deleted this assertion before; whether from this article or a closely related one I don't know. Is someone teaching this crap in some signal-processing course?

regarding [1]. Seriously, what's the problem here?

The MVUE is equivalent to the minimum mean squared error (MMSE) since the bias is constrained to zero.
In other words, if the bias is zero then the MMSE is the MVUE (if it exists).

If the estimator is unbiased then minimizing the variance is the same as minimizing the MSE, so it's both the MVUE and MMSE. Where exactly is the "crap" here? Cburnett 01:00, 10 Mar 2005 (UTC)
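For reference, the identity both sides are invoking here is the standard bias–variance decomposition of mean squared error (not part of the original exchange, written out for clarity):

```latex
\operatorname{MSE}(\hat\theta)
  = \mathbb{E}\!\left[(\hat\theta - \theta)^2\right]
  = \operatorname{Var}(\hat\theta) + \bigl(\operatorname{Bias}(\hat\theta)\bigr)^2,
\qquad
\operatorname{Bias}(\hat\theta) = \mathbb{E}[\hat\theta] - \theta .
```

Within the class of unbiased estimators the bias term vanishes, so MSE equals variance and minimizing one minimizes the other; across all estimators, however, a nonzero bias can be traded against a larger reduction in variance.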

IF an estimator is unbiased.
Read the line above! That's far too big an "if"! Your assertion seems to be saying that the minimum-variance unbiased estimator has the smallest MSE of ANY estimator, since it has the smallest MSE of any UNBIASED estimator. In other words, you seem to be assuming that the estimator with the smallest MSE can only be an unbiased estimator.
Your words above make me suspect that you meant something else: you might have meant it has the smallest MSE of all unbiased estimators, rather than the smallest MSE of all estimators. If that's what you meant, then your meaning was completely unclear. Let's look at what you wrote:
The MVUE is equivalent to the minimum mean squared error (MMSE) since the bias is constrained to zero. In other words, if the bias is zero then the MMSE is the MVUE (if it exists).
Combining what I deleted from the article, immediately above, with your words on this discussion page, it does look as if that other meaning is what you had in mind. Apparently by "MMSE" you meant NOT "minimum mean squared error estimator" but rather "minimum mean squared error UNBIASED estimator". If that's what you meant, then what you meant was right. But it's hard to tell that that's what you meant, given what you wrote, since you didn't write "minimum mean squared error UNBIASED estimator".
In some cases, there are biased estimators that have far smaller mean squared error than the MSE of the best unbiased estimator. Michael Hardy 01:34, 10 Mar 2005 (UTC)
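A concrete instance of this last point is variance estimation under normality: the unbiased sample variance divides by n-1, while dividing by n+1 is biased but minimizes MSE among estimators proportional to the sum of squared deviations. A short Monte Carlo sketch (hypothetical illustration, not from the discussion) makes the gap visible:

```python
# Compare MSE of the unbiased sample variance (divisor n-1) against
# the biased estimator with divisor n+1, for X_i ~ N(0, 1).
# Theory for normal data: MSE = 2*sigma^4/(n-1) vs 2*sigma^4/(n+1).
import random

random.seed(0)
n, reps, true_var = 5, 20000, 1.0
sq_err_unbiased = sq_err_biased = 0.0
for _ in range(reps):
    xs = [random.gauss(0.0, 1.0) for _ in range(n)]
    xbar = sum(xs) / n
    ss = sum((x - xbar) ** 2 for x in xs)  # sum of squared deviations
    sq_err_unbiased += (ss / (n - 1) - true_var) ** 2
    sq_err_biased += (ss / (n + 1) - true_var) ** 2

mse_unbiased = sq_err_unbiased / reps  # theory: 2/(n-1) = 0.5
mse_biased = sq_err_biased / reps      # theory: 2/(n+1) ~ 0.333
print(mse_unbiased, mse_biased)
```

The biased estimator's MSE comes out clearly smaller, which is exactly the phenomenon Hardy is pointing at: restricting to unbiased estimators can cost MSE.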
Yes, "you might have meant it has the smallest MSE of all unbiased estimators, rather than the smallest MSE of all estimators" is true, and I have no problem admitting it was unclear. So can we cooperate on getting something clear into the article, or do we have to keep dodging that while I keep setting you up for more "teaching this crap in some signal-processing course" comments? Cburnett 05:28, 10 Mar 2005 (UTC)