Metascience (research)

From Wikipedia, the free encyclopedia

Metascience (also known as meta-research) is the use of scientific methodology to study science itself. Metascience seeks to increase the quality of scientific research while reducing waste. It is also known as "research on research" and "the science of science", as it uses research methods to study how research is done and where improvements can be made. Metascience concerns itself with all fields of research and has been described as "a bird's eye view of science."[1] In the words of John Ioannidis, "Science is the best thing that has happened to human beings ... but we can do it better."[2]

History[edit]

Metascience has grown as a reaction to the replication crisis and to concerns about waste in research.[3] The earliest meta-research paper, published in 1966, examined the statistical methods of 295 papers published in ten high-profile medical journals in 1964. It found that, "in almost 73% of the reports read [...] conclusions were drawn when the justification for these conclusions was invalid."[4]

In medical and health research, concerns have been raised about waste due to publication bias, inadequate research reporting, and poor study design, as in the case of inadequate blinding. An estimated 85% of the worldwide budget for medical and health research is wasted.[5] Separately, empirical attempts to reproduce published peer-reviewed results in a range of fields have failed in 75–90% of cases.[6]

Many prominent science publishers are interested in meta-research and in improving the quality of their publications. Many concerns about waste in medical and health research were described in a 2014 Lancet special issue on 'Research: increasing value, reducing waste'. Nature includes an ongoing special section on "Challenges in irreproducible research", which had published 12 editorials as of August 2017. Science has had an editorial,[7] a policy forum,[8] and a special issue[9] on meta-research and the problems with reproducibility. In 2012 PLOS ONE launched a Reproducibility Initiative. In 2016 PLOS Biology included a section for papers on meta-research.[10] In 2015 BioMed Central introduced a minimum-standards-of-reporting checklist to four titles.

The first international conference in the broad area of meta-research was the Research Waste/EQUATOR conference held in Edinburgh in 2015; the first international conference on peer review was the Peer Review Congress held in 1989.[11] The first journal specifically targeting meta-research was Research Integrity and Peer Review, launched in 2016. The journal's opening editorial called for "research that will increase our understanding and suggest potential solutions to issues related to peer review, study reporting, and research and publication ethics".[12]


Medicine[edit]

A focus of metascientific research in medicine has been blinding. Blinding is poorly reported in medical literature, and the terminology used to describe it is ambiguous. This has resulted in widespread misunderstanding of the subject. Furthermore, the success or failure of a blind is rarely measured or reported, leading to significant hidden bias.[13] Research showing the failure of blinding in antidepressant trials has led some scientists to argue that antidepressants are no better than placebo.[14]

Studies have also been conducted into the influence of pharmaceutical companies on medical research. These studies have found higher rates of reported positive results in the presence of financial conflicts of interest.

Psychology[edit]

The largest body of metascientific research in psychology concerns the replication crisis: a large-scale 2015 effort to replicate 100 published psychology studies found that over half of the findings could not be reproduced.[15]

Applied statistics[edit]

Discourse around the use of statistics in science has focused largely on the misuse of p-values and of statistical-significance thresholds.[16]
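One well-known problem with significance thresholds is multiple testing: even when no real effect exists, roughly 5% of tests will fall below p = 0.05 by chance, so screening many hypotheses and reporting only the "significant" ones produces spurious findings. A minimal simulation (illustrative only, not drawn from the cited sources) makes this concrete:

```python
import math
import random

random.seed(0)

def z_test_p(sample_a, sample_b):
    """Two-sample z-test p-value, assuming unit variance (illustrative)."""
    n = len(sample_a)
    # Difference of sums of 2n unit-variance gaussians has variance 2n.
    z = (sum(sample_a) - sum(sample_b)) / math.sqrt(2 * n)
    # Two-sided p-value from the standard normal CDF via math.erf.
    return 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))

# Both groups are pure noise: the null hypothesis is true in every trial.
trials = 2000
false_positives = sum(
    z_test_p([random.gauss(0, 1) for _ in range(30)],
             [random.gauss(0, 1) for _ in range(30)]) < 0.05
    for _ in range(trials)
)
print(false_positives / trials)  # typically close to 0.05
```

With no true effects at all, about one test in twenty still "succeeds", which is why unreported multiple comparisons inflate the published false-positive rate.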

Physics[edit]

Richard Feynman noted that successive estimates of physical constants clustered closer to previously published values than would be expected by chance, a sign of confirmation bias in analysis. Physicists now implement blind analysis to prevent such bias.[17]
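One common blind-analysis technique discussed in the cited article is to hide the result behind a secret offset while the analysis is developed, and remove the offset only once all analysis choices are frozen. A minimal sketch (the procedure and values here are hypothetical):

```python
import random

def blind(values, seed):
    """Add a secret offset so analysts cannot steer the result toward
    an expected value while tuning their analysis (illustrative sketch)."""
    offset = random.Random(seed).uniform(-10, 10)  # hidden from analysts
    return [v + offset for v in values], offset

measurements = [3.2, 3.5, 3.1, 3.4]  # hypothetical raw data
blinded, secret = blind(measurements, seed=42)

# ... all cuts, fits, and error estimates are finalized using `blinded` ...

unblinded = [v - secret for v in blinded]  # revealed only at the end
```

Because the analysts never see the true central value until the analysis is complete, they cannot (even unconsciously) stop refining it once it agrees with the published literature.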

Scientometrics[edit]

Scientometrics concerns itself with measuring bibliographic data in scientific publications. Major research issues include the measurement of the impact of research papers and academic journals, the understanding of scientific citations, and the use of such measurements in policy and management contexts.[18]
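A simple example of such a measurement is the h-index, a widely used author-level metric: the largest number h such that h of a researcher's papers each have at least h citations. A minimal implementation (the citation counts below are invented for illustration):

```python
def h_index(citations):
    """h-index: the largest h such that h papers have >= h citations each."""
    cited = sorted(citations, reverse=True)
    h = 0
    for rank, count in enumerate(cited, start=1):
        if count >= rank:  # the rank-th most-cited paper has enough citations
            h = rank
        else:
            break
    return h

print(h_index([10, 8, 5, 4, 3]))  # 4
print(h_index([25, 8, 5, 3, 3]))  # 3
```

Note how the second author's single highly cited paper does not raise the index: the metric rewards sustained impact, which is both its appeal and a frequent point of criticism in scientometric research.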

Scientific data science[edit]

Scientific data science is the use of data science to analyse research papers. It encompasses both qualitative and quantitative methods. Research in scientific data science includes fraud detection[19] and citation network analysis.[20]
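Citation network analysis of the kind cited above treats papers as nodes and citations as directed edges, then applies graph algorithms such as PageRank so that a citation from an influential paper counts for more than one from an obscure paper. A minimal unweighted sketch (the four-paper graph is hypothetical; the cited work uses a weighted variant):

```python
def pagerank(graph, damping=0.85, iterations=50):
    """Iterative PageRank on a citation graph {paper: [papers it cites]}."""
    nodes = list(graph)
    rank = {n: 1.0 / len(nodes) for n in nodes}
    for _ in range(iterations):
        new = {n: (1 - damping) / len(nodes) for n in nodes}
        for n, cited in graph.items():
            if cited:
                share = damping * rank[n] / len(cited)
                for c in cited:  # pass rank along each outgoing citation
                    new[c] += share
            else:  # dangling paper cites nothing: spread its rank evenly
                for m in nodes:
                    new[m] += damping * rank[n] / len(nodes)
        rank = new
    return rank

citations = {  # hypothetical papers; edges point at the papers they cite
    "A": ["C"], "B": ["C"], "C": ["D"], "D": [],
}
ranks = pagerank(citations)
print(max(ranks, key=ranks.get))  # D
```

Paper D ends up ranked highest even though it receives only one citation, because that citation comes from C, which itself collects the rank of A and B.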

References[edit]

  1. ^ Ioannidis, John P. A.; Fanelli, Daniele; Dunne, Debbie Drake; Goodman, Steven N. (2015-10-02). "Meta-research: Evaluation and Improvement of Research Methods and Practices". PLOS Biology. 13 (10): e1002264. doi:10.1371/journal.pbio.1002264. ISSN 1545-7885. Retrieved 2019-04-28.
  2. ^ Bach, Becky (8 December 2015). "On communicating science and uncertainty: A podcast with John Ioannidis". Scope. Retrieved 20 May 2019.
  3. ^ "Researching the researchers". Nature Genetics. 46 (5): 417. 2014. doi:10.1038/ng.2972. ISSN 1061-4036. PMID 24769715.
  4. ^ Schor, Stanley (1966). "Statistical Evaluation of Medical Journal Manuscripts". JAMA: The Journal of the American Medical Association. 195 (13): 1123. doi:10.1001/jama.1966.03100130097026. ISSN 0098-7484.
  5. ^ Chalmers, Iain; Glasziou, Paul (2009). "Avoidable waste in the production and reporting of research evidence". The Lancet. 374 (9683): 86–89. doi:10.1016/S0140-6736(09)60329-9. ISSN 0140-6736. PMID 19525005.
  6. ^ Begley, C. G.; Ioannidis, J. P. A. (2014). "Reproducibility in Science: Improving the Standard for Basic and Preclinical Research". Circulation Research. 116 (1): 116–126. doi:10.1161/CIRCRESAHA.114.303819. ISSN 0009-7330. PMID 25552691.
  7. ^ Buck, S. (2015). "Solving reproducibility". Science. 348 (6242): 1403. doi:10.1126/science.aac8041. ISSN 0036-8075. PMID 26113692.
  8. ^ Alberts, B.; Cicerone, R. J.; Fienberg, S. E.; Kamb, A.; McNutt, M.; Nerem, R. M.; Schekman, R.; Shiffrin, R.; Stodden, V.; Suresh, S.; Zuber, M. T.; Pope, B. K.; Jamieson, K. H. (2015). "Self-correction in science at work". Science. 348 (6242): 1420–1422. doi:10.1126/science.aab3847. ISSN 0036-8075. PMID 26113701.
  9. ^ Enserink, Martin (2018). "Research on research". Science. 361 (6408): 1178–1179. doi:10.1126/science.361.6408.1178. ISSN 0036-8075. PMID 30237336.
  10. ^ Kousta, Stavroula; Ferguson, Christine; Ganley, Emma (2016). "Meta-Research: Broadening the Scope of PLOS Biology". PLOS Biology. 14 (1): e1002334. doi:10.1371/journal.pbio.1002334. ISSN 1545-7885. PMC 4699700. PMID 26727031.
  11. ^ Rennie, Drummond (1990). "Editorial Peer Review in Biomedical Publication". JAMA. 263 (10): 1317–1441. doi:10.1001/jama.1990.03440100011001. ISSN 0098-7484. PMID 2304208.
  12. ^ Harriman, Stephanie L.; Kowalczuk, Maria K.; Simera, Iveta; Wager, Elizabeth (2016). "A new forum for research on research integrity and peer review". Research Integrity and Peer Review. 1 (1): 5. doi:10.1186/s41073-016-0010-y. ISSN 2058-8615. PMC 5794038. PMID 29451544.
  13. ^ Tuleu, Catherine; Legay, Helene; Orlu-Gul, Mine; Wan, Mandy (1 September 2013). "Blinding in pharmacological trials: the devil is in the details". Archives of Disease in Childhood. 98 (9): 656–659. doi:10.1136/archdischild-2013-304037. ISSN 0003-9888. Retrieved 8 May 2019.
  14. ^ Ioannidis, John P. A. (27 May 2008). "Effectiveness of antidepressants: an evidence myth constructed from a thousand randomized trials?". Philosophy, Ethics, and Humanities in Medicine. 3: 14. doi:10.1186/1747-5341-3-14. ISSN 1747-5341. Retrieved 23 April 2019.
  15. ^ Baker, Monya. "Over half of psychology studies fail reproducibility test". Nature News. doi:10.1038/nature.2015.18248. Retrieved 9 May 2019.
  16. ^ Check Hayden, Erika. "Weak statistical standards implicated in scientific irreproducibility". Nature News. doi:10.1038/nature.2013.14131. Retrieved 9 May 2019.
  17. ^ MacCoun, Robert; Perlmutter, Saul (8 October 2015). "Blind analysis: Hide results to seek the truth". Nature News. p. 187. doi:10.1038/526187a. Retrieved 9 May 2019.
  18. ^ Leydesdorff, L.; Milojevic, S. (2013). "Scientometrics". arXiv:1208.4566. In: Lynch, M. (ed.), International Encyclopedia of Social and Behavioral Sciences, subsection 85030 (2015).
  19. ^ Markowitz, David M.; Hancock, Jeffrey T. (2016). "Linguistic obfuscation in fraudulent science". Journal of Language and Social Psychology. 35 (4): 435–445. doi:10.1177/0261927X15614605.
  20. ^ Ding, Y. (2010). "Applying weighted PageRank to author citation networks". Journal of the American Society for Information Science and Technology. 62 (2): 236–245. arXiv:1102.1760. doi:10.1002/asi.21452.
