Talk:Minimum mean square error
WikiProject Statistics (Rated C-class, Low-importance)
I've chosen to name this article minimum mean-square error since it seems to be the most frequent form encountered (on the web). Minimum mean-squared error only gets approximately half the hits when searching the web using Google. Redirects from minimum mean-squared error, minimum mean square error and minimum mean squared error have been added to collect the most common forms of spelling. --Fredrik Orderud 00:42, 19 Apr 2005 (UTC)
- I've moved the article from Minimum mean-square error to Minimum mean square error, since removing the dash makes the article title consistent with Mean square error, one of the provided references, and my Probability and Statistics textbook. ~MDD4696 13:42, 3 May 2007 (UTC)
I think this article needs a rewrite:
- MMSE is basically a Bayesian concept, since from a frequentist point of view there is no single minimum MSE estimator. This should be made clear right from the top.
- Relative efficiency is defined but never used. Is this relevant here?
- I was unable to understand the meaning or relevance of the discussion in the "Operational Considerations" section. Again, this article should deal only with the Bayesian viewpoint, with maybe a short reference and link to competing frequentist methods like UMVU estimators.
- The numeric example is nice, but it doesn't explain the underlying concepts. Three important points which can be illustrated with such an example are:
- The orthogonality principle
- The fact that the MMSE estimator is linear in the Gaussian case
- The general formula for a linear MMSE estimator
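To make the last two points above concrete, here is a minimal sketch for the scalar, zero-mean jointly Gaussian case. The covariance numbers are made up for illustration; in this setting the linear rule x̂ = W·y with W = σ_xy/σ_yy is the MMSE estimator, and the orthogonality principle says the error x − W·y is uncorrelated with the data y:

```python
# Hypothetical second-order statistics for jointly Gaussian (x, y), zero means.
sigma_xx, sigma_xy, sigma_yy = 4.0, 1.5, 2.0

# Linear MMSE gain: minimizes E[(x - a*y)^2] = sigma_xx - 2a*sigma_xy + a^2*sigma_yy.
W = sigma_xy / sigma_yy

# The resulting minimum MSE: sigma_xx - sigma_xy^2 / sigma_yy.
mmse = sigma_xx - sigma_xy ** 2 / sigma_yy

# Orthogonality principle: Cov(x - W*y, y) = sigma_xy - W*sigma_yy must vanish.
orthogonality_gap = sigma_xy - W * sigma_yy

print(W, mmse, orthogonality_gap)  # 0.75 2.875 0.0
```

In the Gaussian case this linear rule is not just the best linear estimator but the overall MMSE estimator, which is the second bullet point above.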
- I have completed the cleanup according to the above points, to the best of my ability and understanding. Any comments are welcome. --Zvika (talk) 19:08, 21 January 2008 (UTC)
I disagree. It is not just a Bayesian concept. I wrote a book on this a long time ago. There are lots of examples: the simplest is where the coefficient of variation is known or can be estimated. Other examples include Stein estimation and ridge regression.
The statement "The fact that the MMSE estimator is linear in the Gaussian case" surely shows a frequentist perspective? It is however incorrect IMHO. The 'proof' in the article relates to unbiased estimators. However, as it relates to Gaussian data it is not really relevant to the general issue.
- Both Stein estimation and ridge regression are frequentist techniques. They are not MMSE -- in that they do not achieve the lowest possible MSE. Perhaps our dispute is on wording: the James-Stein estimator is designed to reduce the MSE (compared with LS); it does not bring the MSE to a minimum, though. Indeed, in the frequentist setting you cannot minimize the MSE, because improving the MSE for some values of the unknown parameter will invariably deteriorate performance for other values. So there is no unique minimum MSE.
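The distinction between "reduces the MSE" and "minimizes the MSE" can be checked numerically. Below is a small Monte Carlo sketch (made-up dimensions and sample counts) for estimating a Gaussian mean θ in dimension p ≥ 3 at the particular point θ = 0: the positive-part James-Stein estimator beats the MLE there, yet the trivial estimator that always returns 0 beats both, so James-Stein clearly does not attain a minimum MSE (and the trivial estimator in turn fails badly for large ||θ||):

```python
import random

random.seed(0)
p, trials = 5, 4000           # dimension >= 3; number of Monte Carlo runs
theta = [0.0] * p             # true mean, used only to score the estimators

mse_mle = mse_js = mse_zero = 0.0
for _ in range(trials):
    y = [t + random.gauss(0.0, 1.0) for t in theta]   # y ~ N(theta, I)
    s = sum(v * v for v in y)
    shrink = max(0.0, 1.0 - (p - 2) / s)              # positive-part James-Stein factor
    mse_mle  += sum((v - t) ** 2 for v, t in zip(y, theta))
    mse_js   += sum((shrink * v - t) ** 2 for v, t in zip(y, theta))
    mse_zero += sum(t * t for t in theta)             # estimator identically zero

mse_mle /= trials
mse_js /= trials
mse_zero /= trials

# At theta = 0: mse_zero (= 0) < mse_js < mse_mle (~ p), so James-Stein
# improves on the MLE without being a minimum-MSE estimator.
print(mse_mle, mse_js, mse_zero)
```

The ordering at θ = 0 reverses for θ far from the origin, where the zero estimator's MSE grows as ||θ||², which is exactly the "no unique minimum" point made above.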
- I do not see why you think that my quote about Gaussianity requires a frequentist perspective. The statement itself is correct and can be more accurately stated as follows: "In the jointly Gaussian case the MMSE is linear in the data" (Kay, Statistical Signal Processing, vol. 1, p. 350). There is no proof (with or without quotes) in the article, so I'm not really sure what exactly you're referring to. --Zvika (talk) 10:10, 28 February 2008 (UTC)
The use of terms such as minimum error and unbiased is hotly contended, and
- MMSE is used in both Bayesian and Frequentist probability calculations -- see Talk:Minimum-variance unbiased estimator
- It should not, therefore, be removed from this page so that readers from both viewpoints don't start arguing to no end.
- The purpose of Operational Considerations was to clarify how each school of thought actually does the integral.
- Frequentists use the prior distribution of the statistic.
- Bayesians use the posterior distribution of the parameter.
Sorry, but I disagree completely. MVU and MMSE are entirely different concepts, as User:Michael Hardy correctly explained on Talk:Minimum-variance unbiased estimator. Furthermore, I don't think that this assertion is "hotly contended"; if you think otherwise, please provide a more accurate reference. I am not sure which Jaynes reference you mean; if it is the book Probability Theory: The Logic of Science, then I could not find a reference to MMSE in the index. --Zvika (talk) 20:01, 5 February 2008 (UTC)
http://omega.albany.edu:8008/ETJ-PS/cc16u.ps pp. 9-10. And yes, MSE is a predominant concept in both camps, so it is not impossible for a frequentist to use the term minimum MSE -- especially in reference to estimator efficiency. Also, the use of the terms is hotly contended, as implied above. Frobnitzem (talk) 20:25, 5 February 2008 (UTC)
- The term MSE is used in both a frequentist and Bayesian context, but means different things. In the frequentist context, the MSE is a function of θ, so it is not possible to talk about a minimum MSE (since one can never minimize the MSE for all θ simultaneously). Thus, in this article, which discusses minimum MSE, there is no possible frequentist interpretation. This does not imply anything about the validity of the frequentist point of view (which is an entirely legitimate point of view); it just says that such a point of view belongs elsewhere (e.g., the article on mean squared error).
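The point that frequentist MSE is a function of θ admits a one-line closed-form illustration. Using a single observation y ~ N(θ, 1) and a hypothetical shrinkage rule x̂ = a·y with a = 0.5 (both the rule and the numbers are my own, chosen only for illustration), MSE_mle(θ) = 1 while MSE_shrink(θ) = a² + (1 − a)²θ², and neither curve lies below the other for all θ:

```python
# y ~ N(theta, 1); compare MLE xhat = y with a hypothetical shrinkage rule
# xhat = a*y. Closed-form risks:
#   MSE_mle(theta)    = Var(y) = 1
#   MSE_shrink(theta) = a^2 * 1 + (1 - a)^2 * theta^2   (variance + bias^2)
a = 0.5

def mse_mle(theta):
    return 1.0

def mse_shrink(theta):
    return a ** 2 + (1.0 - a) ** 2 * theta ** 2

print(mse_shrink(0.0), mse_mle(0.0))  # 0.25 vs 1.0: shrinkage wins near theta = 0
print(mse_shrink(3.0), mse_mle(3.0))  # 2.5 vs 1.0: MLE wins far from theta = 0
```

Since the risk curves cross, no estimator minimizes the MSE uniformly in θ, which is why "minimum MSE" only picks out a single estimator once a prior over θ is introduced.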
- If you can provide a reliable source discussing the term "minimum MSE" in a frequentist context, then it should be placed here. Otherwise, I don't see why you consider this term "hotly contended", nor why you have restored the old version.
- Finally, the link you placed above does not mention the term MSE at all. --Zvika (talk) 09:03, 6 February 2008 (UTC)