
Box's M test


Box's M test is a multivariate statistical test used to check the equality of multiple variance-covariance matrices.[1] It is commonly used to verify the assumption of homogeneity of variances and covariances in MANOVA and linear discriminant analysis. The test is named after George E. P. Box, who first discussed it in 1949. Under the null hypothesis of equal covariance matrices, the test statistic is approximately chi-squared distributed.
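As an illustration of how the statistic and its chi-squared approximation are computed, the following is a minimal Python sketch, assuming NumPy and SciPy are available; the function name box_m_test is illustrative rather than taken from any source. For k groups with sample sizes n_i, sample covariance matrices S_i, pooled covariance S_p, N = n_1 + ... + n_k observations, and p variables, it computes M = (N − k) ln|S_p| − Σ (n_i − 1) ln|S_i|, applies Box's scaling factor, and refers the scaled statistic to a chi-squared distribution with p(p + 1)(k − 1)/2 degrees of freedom.

import numpy as np
from scipy.stats import chi2

def box_m_test(groups):
    """Box's M test for equality of covariance matrices.

    groups: list of (n_i, p) data arrays, one per group.
    Returns the M statistic, the scaled chi-squared statistic, and the p-value.
    """
    k = len(groups)                            # number of groups
    p = groups[0].shape[1]                     # number of variables
    ns = np.array([g.shape[0] for g in groups])
    N = ns.sum()

    # Unbiased sample covariance matrix of each group (divisor n_i - 1).
    covs = [np.cov(g, rowvar=False) for g in groups]

    # Pooled covariance matrix across all groups.
    S_pooled = sum((n - 1) * S for n, S in zip(ns, covs)) / (N - k)

    # log|S| via slogdet for numerical stability (covariances are positive definite).
    def logdet(S):
        return np.linalg.slogdet(S)[1]

    # M compares the log-determinant of the pooled covariance with those of the groups.
    M = (N - k) * logdet(S_pooled) - sum((n - 1) * logdet(S) for n, S in zip(ns, covs))

    # Box's scaling factor for the chi-squared approximation.
    c = ((2 * p**2 + 3 * p - 1) / (6 * (p + 1) * (k - 1))) * (
        np.sum(1.0 / (ns - 1)) - 1.0 / (N - k))

    chi2_stat = (1 - c) * M
    df = p * (p + 1) * (k - 1) / 2
    return M, chi2_stat, chi2.sf(chi2_stat, df)

As a quick check, two groups drawn from the same multivariate normal distribution should typically yield a large p-value, while groups with clearly different covariance matrices should yield a small one.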

Box's M test is susceptible to error when the data do not meet model assumptions, and its behavior depends strongly on sample size: with very large samples it can reject the null hypothesis for practically trivial differences between covariance matrices, while with small samples it has little power.[2] It is especially prone to error when the data do not meet the assumption of multivariate normality.[3]

References

  1. Box, G. E. P. (1949). "A General Distribution Theory for a Class of Likelihood Criteria". Biometrika. 36 (3–4): 317–346. doi:10.1093/biomet/36.3-4.317.
  2. Warner, Rebecca M. (2013). Applied Statistics: From Bivariate Through Multivariate Techniques. SAGE. p. 778. ISBN 978-1-4129-9134-6.
  3. Manly, Bryan F. J. (2004). Multivariate Statistical Methods: A Primer (3rd ed.). CRC Press. p. 54. ISBN 978-1-58488-414-9.