Metascience
Metascience refers to the systematic investigation of the scientific enterprise: the use of scientific methodology to study science itself. It is also known as "research on research" or "the science of science", since it applies research methods to study how research is done and where improvements can be made. It covers all fields of scientific research (including health and medical research) and has been described as "taking a bird’s eye view of science".[1] Metascience aims to improve scientific practice, as summed up by John Ioannidis: "Science is the best thing that has happened to human beings [...] but we can do it better".
History
Metascience has grown as a reaction to the replication crisis and to concerns about waste in research.[2] The earliest meta-research paper was published in 1966; it examined the statistical methods of 295 papers published in ten high-profile medical journals in 1964 and found that, "in almost 73% of the reports read [...] conclusions were drawn when the justification for these conclusions was invalid."[3]
In health and medical research, concerns have been raised about waste due to publication bias, inadequate research reporting, and poor study design, such as inadequate blinding. An estimated 85% of the worldwide health and medical research budget is currently wasted.[4] The 85% figure is supported by multiple empirical studies across a range of fields that attempted to reproduce published peer-reviewed research and failed on 75% to 90% of occasions.[5]
Many high-profile scientific publishers are interested in meta-research and in improving the quality of their publications. Many of the concerns about waste in health and medical research were described in the 2014 Lancet special issue 'Research: increasing value, reducing waste'. Nature runs an ongoing special section on "Challenges in irreproducible research", with 12 editorials as of August 2017. Science has published an editorial,[6] a policy forum[7] and a special issue[8] on meta-research and the problems with reproducibility. PLOS ONE launched a Reproducibility Initiative in 2012. PLOS Biology introduced a section for meta-research papers in 2016.[9] BioMed Central introduced a minimum-standards-of-reporting checklist to four titles in 2015.
The first international conference in the broad area of meta-research was the Research Waste/EQUATOR conference, held in Edinburgh in 2015; the first international conference on peer review was the Peer Review Congress, held in 1989.[10] The first journal specifically targeting meta-research, Research Integrity and Peer Review, was launched in 2016. The journal's opening editorial called for "research that will increase our understanding and suggest potential solutions to issues related to peer review, study reporting, and research and publication ethics".[11]
Metascience (journal)
The journal Metascience, published by Springer Nature, serves the metascientific research community. It publishes reviews of books in the history of science, philosophy of science, sociology of science, and science studies. The journal originated in Australia and has had a variety of editors over the years, including Steven French, Stathis Psillos, and Theodore Arabatzis. It is now edited by K. Brad Wray and Luciano Boschiero.[12]
Scientific data science
Metascience encompasses both qualitative and quantitative methods. A sub-discipline of metascience that is unique to the modern era is scientific data science: the use of data science to analyse research papers themselves. Research in scientific data science ranges from fraud detection[13] and citation network analysis[14] to the development of novel metrics for assessing research impact.[15]
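As a concrete illustration of citation network analysis, the sketch below ranks papers in a small, invented citation graph with weighted PageRank, in the spirit of the author-citation-network work cited above.[14] The paper names, edge weights, and choice of the networkx library are illustrative assumptions, not the method of any particular study.

```python
# Minimal sketch: ranking papers in a toy citation network with weighted PageRank.
# The graph and weights are invented for illustration; real analyses draw on large
# bibliographic databases and more elaborate weighting schemes.
import networkx as nx

# Directed edges point from the citing paper to the cited paper;
# 'weight' is a hypothetical relevance score attached to each citation.
citations = [
    ("paper_A", "paper_C", 1.0),
    ("paper_B", "paper_C", 0.5),
    ("paper_C", "paper_D", 1.0),
    ("paper_E", "paper_C", 2.0),
    ("paper_E", "paper_D", 1.0),
]

graph = nx.DiGraph()
graph.add_weighted_edges_from(citations)

# Weighted PageRank: a paper's score depends on the scores of the papers
# citing it, scaled by the citation weights.
scores = nx.pagerank(graph, alpha=0.85, weight="weight")

for paper, score in sorted(scores.items(), key=lambda kv: kv[1], reverse=True):
    print(f"{paper}: {score:.3f}")
```

In practice the weighting scheme itself (for example, weighting citations by relevance or by the influence of the citing author) is a research question in its own right, which is what motivates work on alternative impact metrics.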
Meta-research centres
- Meta-Research Innovation Center at Stanford (METRICS) (co-directors John Ioannidis and Steven N. Goodman)
- Center for Open Science (director Brian Nosek)
- Meta-Research Center at Tilburg University
- Projet MiRoR (Methods in Research on Research)
- Centre for Journalology
- European Network for Knowledge Impact
- The Center for Transforming Biomedical Research of the Berlin Institute of Health
See also
- Epistemology
- Science policy
- Science of science policy
- Scientific method
- Sociology of knowledge
- Sociology of scientific knowledge
- Knowledge management
- Data science
- Reproducibility
References
- ^ Ioannidis, John P. A.; Fanelli, Daniele; Dunne, Debbie Drake; Goodman, Steven N. (2015). "Meta-research: Evaluation and Improvement of Research Methods and Practices". PLOS Biology. 13 (10): e1002264. doi:10.1371/journal.pbio.1002264. ISSN 1545-7885.
- ^ "Researching the researchers". Nature Genetics. 46 (5): 417. 2014. doi:10.1038/ng.2972. ISSN 1061-4036. PMID 24769715.
- ^ Schor, Stanley (1966). "Statistical Evaluation of Medical Journal Manuscripts". JAMA: The Journal of the American Medical Association. 195 (13): 1123. doi:10.1001/jama.1966.03100130097026. ISSN 0098-7484.
- ^ Chalmers, Iain; Glasziou, Paul (2009). "Avoidable waste in the production and reporting of research evidence". The Lancet. 374 (9683): 86–89. doi:10.1016/S0140-6736(09)60329-9. ISSN 0140-6736. PMID 19525005.
- ^ Begley, C. G.; Ioannidis, J. P. A. (2014). "Reproducibility in Science: Improving the Standard for Basic and Preclinical Research". Circulation Research. 116 (1): 116–126. doi:10.1161/CIRCRESAHA.114.303819. ISSN 0009-7330. PMID 25552691.
- ^ Buck, S. (2015). "Solving reproducibility". Science. 348 (6242): 1403. doi:10.1126/science.aac8041. ISSN 0036-8075. PMID 26113692.
- ^ Alberts, B.; Cicerone, R. J.; Fienberg, S. E.; Kamb, A.; McNutt, M.; Nerem, R. M.; Schekman, R.; Shiffrin, R.; Stodden, V.; Suresh, S.; Zuber, M. T.; Pope, B. K.; Jamieson, K. H. (2015). "Self-correction in science at work". Science. 348 (6242): 1420–1422. doi:10.1126/science.aab3847. ISSN 0036-8075. PMID 26113701.
- ^ Enserink, Martin (2018). "Research on research". Science. 361 (6408): 1178–1179. doi:10.1126/science.361.6408.1178. ISSN 0036-8075. PMID 30237336.
- ^ Kousta, Stavroula; Ferguson, Christine; Ganley, Emma (2016). "Meta-Research: Broadening the Scope of PLOS Biology". PLOS Biology. 14 (1): e1002334. doi:10.1371/journal.pbio.1002334. ISSN 1545-7885. PMC 4699700. PMID 26727031.
- ^ Rennie, Drummond (1990). "Editorial Peer Review in Biomedical Publication". JAMA. 263 (10): 1317–1441. doi:10.1001/jama.1990.03440100011001. ISSN 0098-7484. PMID 2304208.
- ^ Harriman, Stephanie L.; Kowalczuk, Maria K.; Simera, Iveta; Wager, Elizabeth (2016). "A new forum for research on research integrity and peer review". Research Integrity and Peer Review. 1 (1): 5. doi:10.1186/s41073-016-0010-y. ISSN 2058-8615. PMC 5794038. PMID 29451544.
- ^ Metascience journal website: https://link.springer.com/journal/11016
- ^ Markowitz, David M.; Hancock, Jeffrey T. (2016). "Linguistic obfuscation in fraudulent science". Journal of Language and Social Psychology. 35 (4): 435–445. doi:10.1177/0261927X15614605.
- ^ Ding, Y. (2010). "Applying weighted PageRank to author citation networks". Journal of the American Society for Information Science and Technology. 62 (2): 236–245. arXiv:1102.1760. doi:10.1002/asi.21452.
- ^ Zhu, X.; Turney, P.; Lemire, D.; Vellino, A. (2015). "Measuring academic influence: Not all citations are equal". Journal of the American Society for Information Science and Technology. 66 (2): 408–427. arXiv:1501.06587. doi:10.1002/asi.23179. hdl:2027.42/134425.
External links
Journals
- Metascience
- Social Studies of Science
- Science & Technology Studies
- Technology in Society
- Research Policy
- Minerva: A Journal of Science, Learning and Policy
- Science Technology and Society
- Science and Public Policy
Further reading
- Harris, Richard (2017). Rigor Mortis: How Sloppy Science Creates Worthless Cures, Crushes Hopes, and Wastes Billions. Basic Books. ISBN 9780465097913.