
Evidence-based practice

From Wikipedia, the free encyclopedia

Evidence-based practice (EBP) is the idea that occupational practices ought to be based on scientific evidence. Although this may seem obviously desirable, the proposal has been controversial, with some arguing that results may not generalize to individuals as well as traditional practices do.[1] Evidence-based practices have been gaining ground since the formal introduction of evidence-based medicine in 1992 and have spread to the allied health professions, education, management, law, public policy, architecture, and other fields.[2] In light of studies showing problems in scientific research (such as the replication crisis), there is also a movement to apply evidence-based practices in scientific research itself. Research into the evidence-based practice of science is called metascience.

The movement towards evidence-based practices attempts to encourage and, in some instances, require professionals and other decision-makers to pay more attention to evidence to inform their decision-making. The goal of evidence-based practice is to eliminate unsound or outdated practices in favor of more-effective ones by shifting the basis for decision making from tradition, intuition, and unsystematic experience to firmly grounded scientific research.[3]

History

For most of history, professions have based their practices on expertise derived from experience passed down in the form of tradition. Many of these practices have not been justified by evidence, which has sometimes enabled quackery and poor performance. Even when overt quackery is not present, quality and efficiency of tradition-based practices may not be optimal. As the scientific method has become increasingly recognized as a sound means to evaluate practices, evidence-based practices have become increasingly adopted.

One of the earliest proponents of EBP was Archie Cochrane, an epidemiologist who authored the book Effectiveness and Efficiency: Random Reflections on Health Services in 1972. Cochrane's book argued for the importance of properly testing health care strategies, and was foundational to the evidence-based practice of medicine.[4] Cochrane suggested that because resources would always be limited, they should be used to provide forms of health care which had been shown in properly designed evaluations to be effective. Cochrane maintained that the most reliable evidence was that which came from randomised controlled trials.[5]

The term "evidence-based medicine" was introduced by Gordon Guyatt in 1990 in an unpublished program description and first appeared in print in 1992.[6][7][8] This marked the first evidence-based practice to be formally established. Some early experiments in evidence-based medicine involved testing primitive medical techniques such as bloodletting, and studying the effectiveness of modern and accepted treatments. There has been a push for evidence-based practices in medicine by insurance providers, which have sometimes refused coverage of practices lacking systematic evidence of usefulness. Most clients now expect medical professionals to make decisions based on evidence and to stay informed about the most up-to-date information. Since the widespread adoption of evidence-based practices in medicine, the use of evidence-based practices has rapidly spread to other fields.[9]

More recently, there has been a push for evidence-based education. The use of evidence-based learning techniques such as spaced repetition can improve students' rate of learning. Some commentators[who?] have suggested that the lack of any substantial progress in the field of education is attributable to practice resting on the unconnected and noncumulative experience of thousands of individual teachers, each re-inventing the wheel and failing to learn from hard scientific evidence about 'what works'. Opponents of this view argue that teaching methods are hard to assess because their effects depend on a host of factors, not least the style, personality and beliefs of the teacher and the needs of the particular children.[10] Others argue that teacher experience could be combined with research evidence, but without the latter being treated as a privileged source.[11] This is in line with a school of thought suggesting that EBP has limitations and that a better alternative is evidence-informed practice (EIP), a process that draws on quantitative evidence, excludes non-scientific prejudices, and incorporates qualitative factors such as clinical experience and the discernment of practitioners and clients.[12][13][14]
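
To illustrate how a spaced-repetition schedule operates as a technique, the following is a minimal Python sketch of a Leitner-style scheduler. The box count, interval lengths, and function name are assumptions chosen for illustration, not values taken from any study cited here.

```python
from datetime import date, timedelta

# Minimal Leitner-style spaced-repetition scheduler (illustrative sketch only).
# Review intervals roughly double after each correct answer; a wrong answer
# sends the item back to the shortest interval. The specific interval values
# are assumptions for this example, not figures prescribed by the literature.
INTERVALS_DAYS = [1, 2, 4, 8, 16, 32]

def next_review(box: int, correct: bool, today: date) -> tuple[int, date]:
    """Return the item's new box and the date it should next be reviewed."""
    if correct:
        box = min(box + 1, len(INTERVALS_DAYS) - 1)  # promote, capped at the last box
    else:
        box = 0  # demote to the most frequently reviewed box
    return box, today + timedelta(days=INTERVALS_DAYS[box])

# Example: an item in box 2 answered correctly moves to box 3
# and is scheduled again 8 days later.
new_box, due = next_review(box=2, correct=True, today=date(2022, 10, 19))
print(new_box, due)  # 3 2022-10-27
```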

Versus tradition

Evidence-based practice is a philosophical approach that is in opposition to tradition. Some degree of reliance on "the way it was always done" can be found in almost every profession, even when those practices are contradicted by new and better information.[15]

Some critics argue that since research is conducted on a population level, results may not generalise to each individual within the population. Therefore, evidence-based practices may fail to provide the best solution for each individual, and traditional practices may better accommodate individual differences. In response, researchers have made an effort to test whether particular practices work better for different subcultures, personality types, etc.[16] Some authors have redefined EBP to include practice that incorporates common wisdom, tradition, and personal values alongside practices based on evidence.[15]

Evaluating evidence

Evaluating scientific research is extremely complex. The process can be greatly simplified by using a heuristic called a hierarchy of evidence, which ranks the relative strength of results obtained from scientific research. The design of the study and the endpoints measured (such as survival or quality of life) affect the strength of the evidence. Typically, systematic reviews and meta-analyses rank at the top of the hierarchy, randomized controlled trials rank above observational studies, and expert opinion and case reports rank at the bottom. There is broad agreement on the relative strength of the different types of studies, but there is no single, universally accepted hierarchy of evidence; more than 80 different hierarchies have been proposed for assessing medical evidence.[17]
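
As a concrete illustration, the sketch below encodes one simplified, hypothetical hierarchy in Python and uses it to order findings by study design. The specific levels and rank numbers are assumptions for illustration only, since, as noted above, no single hierarchy is universally accepted.

```python
# A simplified, hypothetical hierarchy of evidence (rank 1 = strongest).
# The ordering follows the rough consensus described above; real hierarchies
# differ in their details, and more than 80 have been proposed.
HIERARCHY = {
    "systematic review / meta-analysis": 1,
    "randomized controlled trial": 2,
    "observational study": 3,
    "case report": 4,
    "expert opinion": 5,
}

def sort_by_evidence_strength(findings):
    """Order findings from strongest to weakest study design."""
    return sorted(findings, key=lambda f: HIERARCHY.get(f["design"], len(HIERARCHY) + 1))

findings = [
    {"design": "expert opinion", "claim": "Intervention A improves quality of life"},
    {"design": "randomized controlled trial", "claim": "Intervention A improves quality of life"},
    {"design": "observational study", "claim": "Intervention A improves quality of life"},
]

for f in sort_by_evidence_strength(findings):
    print(f"{f['design']}: {f['claim']}")
```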

Applications

Medicine

Evidence-based medicine (EBM) is an approach to medical practice intended to optimize decision-making by emphasizing the use of evidence from well-designed and well-conducted research. Although all medicine based on science has some degree of empirical support, EBM goes further, classifying evidence by its epistemologic strength and requiring that only the strongest types (coming from meta-analyses, systematic reviews, and randomized controlled trials) can yield strong recommendations; weaker types (such as from case-control studies) can yield only weak recommendations. The term was originally used to describe an approach to teaching the practice of medicine and improving decisions by individual physicians about individual patients.[18] Use of the term rapidly expanded to include a previously described approach that emphasized the use of evidence in the design of guidelines and policies that apply to groups of patients and populations ("evidence-based practice policies").[19]

Whether applied to medical education, decisions about individuals, guidelines and policies applied to populations, or administration of health services in general, evidence-based medicine advocates that to the greatest extent possible, decisions and policies should be based on evidence, not just the beliefs of practitioners, experts, or administrators. It thus tries to ensure that a clinician's opinion, which may be limited by knowledge gaps or biases, is supplemented with all available knowledge from the scientific literature so that best practice can be determined and applied. It promotes the use of formal, explicit methods to analyze evidence and to make that evidence available to decision makers, and it promotes programs to teach these methods to medical students, practitioners, and policymakers.

A process has been specified that provides a standardised route for those seeking to produce evidence of the effectiveness of interventions.[20] Originally developed to establish processes for the production of evidence in the housing sector, the standard is general in nature and is applicable across a variety of practice areas and potential outcomes of interest.

Mental health

To improve dissemination of evidence-based practices, the Association for Behavioral and Cognitive Therapies (ABCT) and the Society of Clinical Child and Adolescent Psychology (SCCAP, Division 53 of the American Psychological Association)[21] maintain updated information on their websites about evidence-based practices in psychology for practitioners and the general public. An evidence-based practice consensus statement was developed at a summit on mental healthcare in 2018; as of June 23, 2019, it had been endorsed by 36 organizations.

Metascience

There is also a movement to apply evidence-based practice to the conduct of scientific research itself, in an attempt to address the replication crisis and other major issues affecting scientific research.[22] The application of evidence-based practices to research itself is called metascience, which seeks to increase the quality of scientific research while reducing waste. It is also known as "research on research" and "the science of science", as it uses research methods to study how research is done and where improvements can be made. The five main areas of research in metascience are methodology, reporting, reproducibility, evaluation, and incentives.[23] Metascience has produced a number of reforms in science, such as the use of study pre-registration and the implementation of reporting guidelines, with the goal of improving scientific research practices.[24]

Education

Evidence-based education (EBE), also known as evidence-based interventions, is a model in which policy-makers and educators use empirical evidence to make informed decisions about education interventions (policies, practices, and programs).[25] In other words, decisions are based on scientific evidence rather than opinion.

EBE has gained attention since English author David H. Hargreaves suggested in 1996 that education would be more effective if teaching, like medicine, was a "research-based profession".[26]

Since 2000, studies in Australia, England, Scotland and the USA have supported the use of research to improve educational practices in teaching reading.[27][28][29]

In 1997, the National Institute of Child Health and Human Development (NICHD) convened a national panel to assess the effectiveness of different approaches used to teach children to read. The resulting National Reading Panel examined quantitative research studies on many areas of reading instruction, including phonics and whole language. In 2000 it published a report entitled Teaching Children to Read: An Evidence-based Assessment of the Scientific Research Literature on Reading and its Implications for Reading Instruction that provided a comprehensive review of what was known about best practices in reading instruction in the U.S.[30][31][32]

This occurred around the same time as such international studies as the Programme for International Student Assessment (PISA) in 2000 and the Progress in International Reading Literacy Study (PIRLS) in 2001.

Subsequently, evidence-based practice in education (also known as scientifically based research) came into prominence in the United States under the No Child Left Behind Act of 2001, which was replaced in 2015 by the Every Student Succeeds Act (ESSA).

In 2002 the U.S. Department of Education founded the Institute of Education Sciences (IES) to provide scientific evidence to guide education practice and policy.

English author Ben Goldacre advocated in 2013 for systemic change and more randomized controlled trials to assess the effects of educational interventions.[33] In 2014 the National Foundation for Educational Research, Berkshire, England[34] published a report entitled Using Evidence in the Classroom: What Works and Why.[35] In 2014 the British Educational Research Association (BERA) and the Royal Society of Arts (RSA) advocated for a closer working partnership between teacher-researchers and the wider academic research community.[36][37]

Reviews of existing research on education

The following websites offer free analysis and information on education research:

  • The Best Evidence Encyclopedia (BEE)[38] is a free website created by the Johns Hopkins University School of Education's Center for Data-Driven Reform in Education (established in 2004) and is funded by the Institute of Education Sciences, U.S. Department of Education. It gives educators and researchers reviews of the strength of the evidence supporting a variety of English programs available for students in grades K–12. The reviews cover programs in areas such as mathematics, reading, writing, science, comprehensive school reform, and early childhood education, and include such topics as the effectiveness of technology and struggling readers.
  • The Education Endowment Foundation was established in 2011 by the Sutton Trust, as lead charity in partnership with Impetus Trust; together they form the government-designated What Works Centre for UK Education.[39]
  • Evidence for ESSA[40] began in 2017 and is produced by the Center for Research and Reform in Education (CRRE)[41] at the Johns Hopkins University School of Education. It offers free, up-to-date information on current PK–12 programs in reading, writing, math, science, and other areas, measured against the standards of the Every Student Succeeds Act (ESSA), the United States K–12 public education policy signed by President Obama in 2015.[42] It provides information both on programs that meet ESSA standards and on those that do not.
  • What Works Clearinghouse (WWC),[43] established in 2002, evaluates numerous educational programs in twelve categories according to the quality and quantity of the evidence and their effectiveness. It is operated by the federal National Center for Education Evaluation and Regional Assistance (NCEE), part of the Institute of Education Sciences (IES).[43]
  • Social Programs That Work is administered by the Evidence-Based Policy team at Arnold Ventures LLC. The team is composed of the former leadership of the Coalition for Evidence-Based Policy, a nonprofit, nonpartisan organization advocating the use of well-conducted randomized controlled trials (RCTs) in policy decisions.[44] It offers information on twelve types of social programs, including education.

A variety of other organizations offer information on research and education.

References

  1. ^ For example: Trinder, L. and Reynolds, S. (eds) (2000) Evidence-Based Practice: A Critical Appraisal. Oxford, Blackwell Science.
  2. ^ Li, Rita Yi Man; Chau, Kwong Wing; Zeng, Frankie Fanjie (2019). "Ranking of Risks for Existing and New Building Works". Sustainability. 11 (10): 2863. doi:10.3390/su11102863.
  3. ^ Leach, Matthew J. (2006). "Evidence-based practice: A framework for clinical practice and research design". International Journal of Nursing Practice. 12 (5): 248–251. doi:10.1111/j.1440-172X.2006.00587.x. ISSN 1440-172X. PMID 16942511. S2CID 37311515.
  4. ^ Cochrane, A.L. (1972). Effectiveness and Efficiency. Random Reflections on Health Services. London: Nuffield Provincial Hospitals Trust. ISBN 978-0900574177. OCLC 741462.
  5. ^ Cochrane Collaboration (2003) http://www.cochrane.org/about-us/history/archie-cochrane Archived 2021-02-24 at the Wayback Machine
  6. ^ "Development of evidence-based medicine explored in oral history video". American Medical Association. Retrieved 2020-12-23.
  7. ^ Sackett, D L; Rosenberg, W M (November 1995). "The need for evidence-based medicine". Journal of the Royal Society of Medicine. 88 (11): 620–624. doi:10.1177/014107689508801105. ISSN 0141-0768. PMC 1295384. PMID 8544145.
  8. ^ Evidence-Based Medicine Working Group (1992-11-04). "Evidence-based medicine. A new approach to teaching the practice of medicine". JAMA. 268 (17): 2420–2425. doi:10.1001/jama.1992.03490170092032. ISSN 0098-7484. PMID 1404801.
  9. ^ "A Brief History of Evidence-based Practice". Evidence Based Practice in Optometry. University of New South Wales. Retrieved 24 June 2019.
  10. ^ Hammersley, M. (2013) The Myth of Research-Based Policy and Practice. London: Sage.
  11. ^ Thomas, G. and Pring, R. (Eds.) (2004). Evidence-based Practice in Education. Open University Press.
  12. ^ Nevo, Isaac; Slonim-Nevo, Vered (September 1, 2011). "The Myth of Evidence-Based Practice: Towards Evidence-Informed Practice". The British Journal of Social Work. 41 (6): 1176–1197. doi:10.1093/bjsw/bcq149 – via Silverchair.
  13. ^ "Working in Health Promoting Ways". Tasmanian Department of Health.
  14. ^ "Evidence-based Practice vs. Evidence-informed Practice, M. Gail Woodbury and Janet L. Kuhnke, Queen's University, ON, April 2014".
  15. ^ a b Buysse, V.; Wesley, P.W. (2006). "Evidence-based practice: How did it emerge and what does it really mean for the early childhood field?". Zero to Three. 27 (2): 50–55. ISSN 0736-8038.
  16. ^ de Groot, M.; van der Wouden, J. M.; van Hell, E. A.; Nieweg, M. B. (31 July 2013). "Evidence-based practice for individuals or groups: let's make a difference". Perspectives on Medical Education. 2 (4): 216–221. doi:10.1007/s40037-013-0071-2. PMC 3792230. PMID 24101580.
  17. ^ Siegfried T (2017-11-13). "Philosophical critique exposes flaws in medical evidence hierarchies". Science News. Retrieved 2018-05-16.
  18. ^ Evidence-Based Medicine Working Group (November 1992). "Evidence-based medicine. A new approach to teaching the practice of medicine". JAMA. 268 (17): 2420–25. CiteSeerX 10.1.1.684.3783. doi:10.1001/JAMA.1992.03490170092032. PMID 1404801.
  19. ^ Eddy DM (1990). "Practice Policies – Where Do They Come from?". Journal of the American Medical Association. 263 (9): 1265, 1269, 1272, 1275. doi:10.1001/jama.263.9.1265. PMID 2304243.
  20. ^ Vine, Jim (2016), Standard for Producing Evidence – Effectiveness of Interventions – Part 1: Specification (StEv2-1), HACT, Standards of Evidence, ISBN 978-1-911056-01-0
  21. ^ "SCCAP Division 53 – The Society for Child Clinical and Adolescent Psychology".
  22. ^ Rathi, Akshat (22 October 2015). "Most science research findings are false. Here's how we can change that". Quartz. Retrieved 13 June 2019.
  23. ^ Ioannidis, John P. A.; Fanelli, Daniele; Dunne, Debbie Drake; Goodman, Steven N. (2 October 2015). "Meta-research: Evaluation and Improvement of Research Methods and Practices". PLOS Biology. 13 (10): e1002264. doi:10.1371/journal.pbio.1002264. ISSN 1544-9173. PMC 4592065. PMID 26431313.
  24. ^ Ioannidis, John P. A.; Fanelli, Daniele; Dunne, Debbie Drake; Goodman, Steven N. (2015-10-02). "Meta-research: Evaluation and Improvement of Research Methods and Practices". PLOS Biology. 13 (10): e1002264. doi:10.1371/journal.pbio.1002264. ISSN 1545-7885. PMC 4592065. PMID 26431313.
  25. ^ Trinder, L. and Reynolds, S. (eds) (2000) Evidence-Based Practice: A critical appraisal, Oxford, Blackwell Science.
  26. ^ "Teaching as a research-based profession, David H. Hargreaves, 1996, researchgate.net".
  27. ^ "Teaching Reading" (PDF). Australian Government Department of Education, Science and Training.
  28. ^ "Independent review of the teaching of early reading, 2006" (PDF). Archived (PDF) from the original on 2010-05-12. Retrieved 2020-07-31.
  29. ^ Johnston, Rhona S; Watson, Joyce E, Insight 17 - A seven year study of the effects of synthetic phonics teaching on reading and spelling attainment, IAC:ASU Schools, ISSN 1478-6796, archived from the original on 2017-01-14
  30. ^ "National Reading Panel (NRP) – Publications and Materials – Summary Report". National Institute of Child Health and Human Development. (2000). Report of the National Reading Panel. Teaching children to read: An evidence-based assessment of the scientific research literature on reading and its implications for reading instruction (NIH Publication No. 00-4769). Washington, DC: U.S. Government Printing Office. 2000. Archived from the original on 2010-06-10.
  31. ^ "National Reading Panel (NRP) – Publications and Materials – Reports of the Subgroups". National Institute of Child Health and Human Development. (2000). Report of the National Reading Panel. Teaching children to read: an evidence-based assessment of the scientific research literature on reading and its implications for reading instruction: Reports of the subgroups (NIH Publication No. 00-4754). Washington, DC: U.S. Government Printing Office. 2000. Archived from the original on 2010-06-11.
  32. ^ "Teacher's Guide, Put Reading First - K-3, NICHD, edpubs@inet.ed.gov" (PDF).
  33. ^ "Building Evidence into Education". GOV.UK.
  34. ^ "NFER Home page".
  35. ^ "Using Evidence in the Classroom: What Works and Why, Nelson, J. and O'Beirne, C. (2014). Slough: NFER. ISBN 978-1-910008-07-2" (PDF).
  36. ^ "The role of research in teacher education: reviewing the evidence-BERA-RSA, January 2014" (PDF).
  37. ^ "Research and Teacher Education". www.bera.ac.uk.
  38. ^ "Best Evidence Encyclopedia". Best Evidence Encyclopedia.
  39. ^ "Education Endowment Foundation, UK".
  40. ^ "Evidence for ESSA".
  41. ^ "Center for Research and Reform in Education".
  42. ^ "Every Student Succeeds Act (ESSA) | U.S. Department of Education". www.ed.gov.
  43. ^ a b "WWC | Find What Works!". ies.ed.gov.
  44. ^ "Social programs that work". http://toptierevidence.org/