Metascience: Difference between revisions

From Wikipedia, the free encyclopedia
Revision as of 17:24, 24 May 2019

Metascience (also known as meta-research) is the use of scientific methodology to study science itself. Metascience seeks to increase the quality of scientific research while reducing waste. It is also known as "research on research" and "the science of science", as it uses research methods to study how research is done and where improvements can be made. Metascience concerns itself with all fields of research and has been described as "a bird's eye view of science."[1] In the words of John Ioannidis, "Science is the best thing that has happened to human beings ... but we can do it better."[2]

History

Metascience has grown as a reaction to the replication crisis and to concerns about waste in research.[3] The earliest meta-research paper, published in 1966, examined the statistical methods of 295 papers published in ten high-profile medical journals in 1964. It found that, "in almost 73% of the reports read [...] conclusions were drawn when the justification for these conclusions was invalid."[4]

In medical and health research, concerns have been raised about waste due to publication bias, inadequate research reporting, and poor study design, as in the case of inadequate blinding. An estimated 85% of the worldwide budget for medical and health research is currently wasted.[5] Across a range of fields, empirical studies that attempted to reproduce published peer-reviewed research results have failed to do so on 75–90% of occasions.[6]

Many prominent science publishers are interested in meta-research and in improving the quality of their publications. Many concerns about waste in medical and health research were described in a 2014 Lancet special issue on 'Research: increasing value, reducing waste'. Nature includes an ongoing special section on "Challenges in irreproducible research", which had published 12 editorials as of August 2017. Science has had an editorial,[7] a policy forum,[8] and a special issue[9] on meta-research and the problems with reproducibility. In 2012 PLOS ONE launched a Reproducibility Initiative. In 2016 PLOS Biology included a section for papers on meta-research.[10] In 2015 BioMed Central introduced a minimum-standards-of-reporting checklist to four titles.

The first international conference in the broad area of meta-research was the Research Waste/EQUATOR conference held in Edinburgh in 2015; the first international conference on peer review was the Peer Review Congress held in 1989.[11] The first journal specifically targeting meta-research was Research Integrity and Peer Review, launched in 2016. The journal's opening editorial called for "research that will increase our understanding and suggest potential solutions to issues related to peer review, study reporting, and research and publication ethics".[12]

Applications

Medicine

Clinical research in medicine is often of low quality, and many studies cannot be replicated.[13][14] An estimated 85% of research funding is wasted.[5] Additionally, the presence of bias affects research quality.[15] The pharmaceutical industry exerts substantial influence on the design and execution of medical research. Conflicts of interest are common among authors of medical literature[16] and among editors of medical journals. While almost all medical journals require their authors to disclose conflicts of interest, editors are not required to do so.[17] Financial conflicts of interest have been linked to higher rates of positive study results. In antidepressant trials, pharmaceutical sponsorship is the best predictor of trial outcome.[18]

Blinding is another focus of meta-research, as error caused by poor blinding is a source of experimental bias. Blinding is not well reported in medical literature, and widespread misunderstanding of the subject has resulted in poor implementation of blinding in clinical trials.[19] Further, the success or failure of a blind is rarely measured or reported.[20] Research showing failures of blinding in antidepressant trials has led some scientists to argue that antidepressants are no better than placebo.[21] In light of meta-research showing failure of blinding, CONSORT standards recommend that all clinical trials assess and report the quality of blinding.[22]

Psychology

Meta-research has revealed significant problems in psychology. Psychological research suffers from high bias,[23] low reproducibility,[24] and widespread misuse of statistics.[25] The replication crisis affects psychology more strongly than any other field; as many as two-thirds of highly publicized findings may be impossible to replicate.[26] The practice of pre-registration seeks to reduce publication bias and abuse of statistics.[27]

Applied statistics

Discourse around the use of statistics in science has focused largely on its misuse. In particular, discussion has centered on the misuse of p-values and of statistical significance.[28]
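One well-known form of p-value misuse is "optional stopping": testing repeatedly as data accumulate and stopping as soon as p < 0.05. The simulation below is an illustrative sketch, not drawn from any cited study; the function names, sample sizes, and thresholds are made up for the example. It compares the false-positive rate of a fixed-sample z-test on purely null data with that of a procedure that peeks after every observation.

```python
import random
from math import sqrt
from statistics import NormalDist

def p_value(xs):
    """Two-sided z-test of 'mean = 0' for samples from N(mu, 1), sd known."""
    z = (sum(xs) / len(xs)) * sqrt(len(xs))
    return 2 * (1 - NormalDist().cdf(abs(z)))

def false_positive_rate(n_exp=2000, n_obs=50, optional_stopping=False, seed=1):
    """Fraction of null experiments declared 'significant' at p < 0.05."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(n_exp):
        xs = []
        significant = False
        for _ in range(n_obs):
            xs.append(rng.gauss(0, 1))
            # 'Peeking': test after every observation past a minimum n and
            # stop collecting data as soon as the result looks significant.
            if optional_stopping and len(xs) >= 10 and p_value(xs) < 0.05:
                significant = True
                break
        if not optional_stopping:
            significant = p_value(xs) < 0.05
        hits += significant
    return hits / n_exp

honest = false_positive_rate()                        # near the nominal 5%
hacked = false_positive_rate(optional_stopping=True)  # well above 5%
```

With truly null data the fixed-sample test rejects about 5% of the time, while the peeking procedure rejects substantially more often; pre-registering the stopping rule removes exactly this freedom.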

Physics

Richard Feynman noted that new estimates of physical constants tended to fall closer to previously published values than chance would predict, a sign of confirmation bias. Physicists now implement blinding to prevent bias of this kind.[29]
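One common style of blind analysis adds a hidden constant offset to the data before analysis, so analysts cannot steer intermediate results toward an expected value; the offset is removed only after the analysis procedure is frozen. The sketch below is illustrative, assuming hypothetical function names and data rather than any real experiment's pipeline.

```python
import random

def make_blind(values, seed):
    """Add a hidden constant offset to every measurement.
    The offset stays sealed until the analysis procedure is frozen."""
    offset = random.Random(seed).uniform(-10, 10)
    return [v + offset for v in values], offset

def unblind(result, offset):
    """Subtract the sealed offset from the finalized result."""
    return result - offset

measurements = [9.8, 10.1, 9.9, 10.2, 10.0]
blinded, offset = make_blind(measurements, seed=42)
blinded_mean = sum(blinded) / len(blinded)  # analysts only ever see this
true_mean = unblind(blinded_mean, offset)   # revealed once the method is fixed
```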

Sub-fields

Scientometrics

Scientometrics concerns itself with measuring bibliographic data in scientific publications. Major research issues include the measurement of the impact of research papers and academic journals, the understanding of scientific citations, and the use of such measurements in policy and management contexts.[30]
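One concrete example of such an impact measurement is the h-index, a standard scientometric summary of a citation record (not named in the text above; the citation counts below are invented for illustration):

```python
def h_index(citations):
    """Largest h such that h of the papers have at least h citations each."""
    h = 0
    for rank, count in enumerate(sorted(citations, reverse=True), start=1):
        if count >= rank:
            h = rank
        else:
            break
    return h

# A record with papers cited 10, 8, 5, 4 and 3 times has h-index 4:
# four papers each have at least four citations.
```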

Scientific data science

Scientific data science is the use of data science to analyse research papers. It encompasses both qualitative and quantitative methods. Research in scientific data science includes fraud detection[31] and citation network analysis.[32]
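Citation network analysis of the kind cited above often builds on PageRank-style ranking. Below is a sketch of the plain, unweighted iteration over a toy citation graph; the cited work uses a weighted variant, and the graph here is invented for illustration.

```python
def pagerank(cites, damping=0.85, iters=100):
    """Iterative PageRank over a citation graph {paper: [papers it cites]}."""
    nodes = set(cites) | {t for targets in cites.values() for t in targets}
    n = len(nodes)
    rank = {node: 1 / n for node in nodes}
    for _ in range(iters):
        new = {node: (1 - damping) / n for node in nodes}
        for src in nodes:
            targets = cites.get(src, [])
            if targets:
                # A paper passes its rank, damped, to everything it cites.
                for t in targets:
                    new[t] += damping * rank[src] / len(targets)
            else:
                # A paper that cites nothing spreads its rank evenly.
                for node in nodes:
                    new[node] += damping * rank[src] / n
        rank = new
    return rank

# Toy graph: papers A and B both cite C; C cites nothing.
ranks = pagerank({"A": ["C"], "B": ["C"]})
```

In this graph the most-cited paper, C, receives the highest rank, and the ranks of all papers sum to one.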

References

  1. ^ Ioannidis, John P. A.; Fanelli, Daniele; Dunne, Debbie Drake; Goodman, Steven N. (2015-10-02). "Meta-research: Evaluation and Improvement of Research Methods and Practices". PLOS Biology. 13 (10): e1002264. doi:10.1371/journal.pbio.1002264. ISSN 1545-7885. Retrieved 2019-04-28.
  2. ^ Bach, Becky (8 December 2015). "On communicating science and uncertainty: A podcast with John Ioannidis". Scope. Retrieved 20 May 2019.
  3. ^ "Researching the researchers". Nature Genetics. 46 (5): 417. 2014. doi:10.1038/ng.2972. ISSN 1061-4036. PMID 24769715.
  4. ^ Schor, Stanley (1966). "Statistical Evaluation of Medical Journal Manuscripts". JAMA: The Journal of the American Medical Association. 195 (13): 1123. doi:10.1001/jama.1966.03100130097026. ISSN 0098-7484.
  5. ^ a b Chalmers, Iain; Glasziou, Paul (2009). "Avoidable waste in the production and reporting of research evidence". The Lancet. 374 (9683): 86–89. doi:10.1016/S0140-6736(09)60329-9. ISSN 0140-6736. PMID 19525005.
  6. ^ Begley, C. G.; Ioannidis, J. P. A. (2014). "Reproducibility in Science: Improving the Standard for Basic and Preclinical Research". Circulation Research. 116 (1): 116–126. doi:10.1161/CIRCRESAHA.114.303819. ISSN 0009-7330. PMID 25552691.
  7. ^ Buck, S. (2015). "Solving reproducibility". Science. 348 (6242): 1403. doi:10.1126/science.aac8041. ISSN 0036-8075. PMID 26113692.
  8. ^ Alberts, B.; Cicerone, R. J.; Fienberg, S. E.; Kamb, A.; McNutt, M.; Nerem, R. M.; Schekman, R.; Shiffrin, R.; Stodden, V.; Suresh, S.; Zuber, M. T.; Pope, B. K.; Jamieson, K. H. (2015). "Self-correction in science at work". Science. 348 (6242): 1420–1422. doi:10.1126/science.aab3847. ISSN 0036-8075. PMID 26113701.
  9. ^ Enserink, Martin (2018). "Research on research". Science. 361 (6408): 1178–1179. doi:10.1126/science.361.6408.1178. ISSN 0036-8075. PMID 30237336.
  10. ^ Kousta, Stavroula; Ferguson, Christine; Ganley, Emma (2016). "Meta-Research: Broadening the Scope of PLOS Biology". PLOS Biology. 14 (1): e1002334. doi:10.1371/journal.pbio.1002334. ISSN 1545-7885. PMC 4699700. PMID 26727031.
  11. ^ Rennie, Drummond (1990). "Editorial Peer Review in Biomedical Publication". JAMA. 263 (10): 1317–1441. doi:10.1001/jama.1990.03440100011001. ISSN 0098-7484. PMID 2304208.
  12. ^ Harriman, Stephanie L.; Kowalczuk, Maria K.; Simera, Iveta; Wager, Elizabeth (2016). "A new forum for research on research integrity and peer review". Research Integrity and Peer Review. 1 (1): 5. doi:10.1186/s41073-016-0010-y. ISSN 2058-8615. PMC 5794038. PMID 29451544.
  13. ^ Ioannidis, JPA (2016). "Why Most Clinical Research Is Not Useful". PLoS Med. 13 (6): e1002049. doi:10.1371/journal.pmed.1002049. PMC 4915619. PMID 27328301.
  14. ^ Ioannidis JA (13 July 2005). "Contradicted and initially stronger effects in highly cited clinical research". JAMA. 294 (2): 218–228. doi:10.1001/jama.294.2.218. PMID 16014596.
  15. ^ Hsu, Jeremy. "Dark Side of Medical Research: Widespread Bias and Omissions". Live Science. Retrieved 24 May 2019.
  16. ^ "Confronting conflict of interest". Nature Medicine. 24 (11): 1629. November 2018. doi:10.1038/s41591-018-0256-7. ISSN 1546-170X. Retrieved 22 May 2019.
  17. ^ Haque, Waqas; Minhajuddin, Abu; Gupta, Arjun; Agrawal, Deepak (2018). "Conflicts of interest of editors of medical journals". PLoS ONE. 13 (5): e0197141. doi:10.1371/journal.pone.0197141. ISSN 1932-6203. PMID 29775468. Retrieved 22 May 2019.
  18. ^ Moncrieff, J (March 2002). "The antidepressant debate". The British Journal of Psychiatry. 180: 193–194. ISSN 0007-1250. PMID 11872507. Retrieved 22 May 2019.
  19. ^ Bello, S; Moustgaard, H; Hróbjartsson, A (October 2014). "The risk of unblinding was infrequently and incompletely reported in 300 randomized clinical trial publications". Journal of clinical epidemiology. 67 (10): 1059–69. doi:10.1016/j.jclinepi.2014.05.007. ISSN 1878-5921. PMID 24973822. Retrieved 22 May 2019.
  20. ^ Tuleu, Catherine; Legay, Helene; Orlu-Gul, Mine; Wan, Mandy (1 September 2013). "Blinding in pharmacological trials: the devil is in the details". Archives of Disease in Childhood. 98 (9): 656–659. doi:10.1136/archdischild-2013-304037. ISSN 0003-9888. Retrieved 8 May 2019.
  21. ^ Ioannidis, John P. A. (27 May 2008). "Effectiveness of antidepressants: an evidence myth constructed from a thousand randomized trials?". Philosophy, Ethics, and Humanities in Medicine. 3: 14. doi:10.1186/1747-5341-3-14. ISSN 1747-5341. Retrieved 23 April 2019.
  22. ^ Moher, David; Altman, Douglas G.; Schulz, Kenneth F. (24 March 2010). "CONSORT 2010 Statement: updated guidelines for reporting parallel group randomised trials". BMJ. 340: c332. doi:10.1136/bmj.c332. ISSN 0959-8138. Retrieved 24 April 2019.
  23. ^ Franco, Annie; Malhotra, Neil; Simonovits, Gabor (1 January 2016). "Underreporting in Psychology Experiments: Evidence From a Study Registry". Social Psychological and Personality Science. 7 (1): 8–12. doi:10.1177/1948550615598377. ISSN 1948-5506. Retrieved 24 May 2019.
  24. ^ Munafò, Marcus (29 March 2017). "Metascience: Reproducibility blues". Nature. 543: 619–620. doi:10.1038/543619a. ISSN 1476-4687. Retrieved 24 May 2019.
  25. ^ Stokstad, Erik (19 September 2018). "This research group seeks to expose weaknesses in science—and they'll step on some toes if they have to". Science | AAAS. Retrieved 24 May 2019.
  26. ^ "Estimating the reproducibility of psychological science" (PDF). Science. 349 (6251): aac4716. 2015. doi:10.1126/science.aac4716. PMID 26315443.
  27. ^ Aschwanden, Christie (6 December 2018). "Psychology's Replication Crisis Has Made The Field Better". FiveThirtyEight. Retrieved 24 May 2019.
  28. ^ Check Hayden, Erika. "Weak statistical standards implicated in scientific irreproducibility". Nature News. doi:10.1038/nature.2013.14131. Retrieved 9 May 2019.
  29. ^ MacCoun, Robert; Perlmutter, Saul (8 October 2015). "Blind analysis: Hide results to seek the truth". Nature News. p. 187. doi:10.1038/526187a. Retrieved 9 May 2019.
  30. ^ Leydesdorff, L. and Milojevic, S., "Scientometrics" arXiv:1208.4566 (2013), forthcoming in: Lynch, M. (editor), International Encyclopedia of Social and Behavioral Sciences subsection 85030. (2015)
  31. ^ Markowitz, David M.; Hancock, Jeffrey T. (2016). "Linguistic obfuscation in fraudulent science". Journal of Language and Social Psychology. 35 (4): 435–445. doi:10.1177/0261927X15614605.
  32. ^ Ding, Y. (2010). "Applying weighted PageRank to author citation networks". Journal of the American Society for Information Science and Technology. 62 (2): 236–245. arXiv:1102.1760. doi:10.1002/asi.21452.
