Empiric therapy

Empiric therapy or empirical therapy is medical treatment or therapy based on experience[1] and, more specifically, therapy begun on the basis of a clinical "educated guess" in the absence of complete or perfect information. Thus it is applied before the confirmation of a definitive medical diagnosis or without complete understanding of an etiology, whether the biological mechanism of pathogenesis or the therapeutic mechanism of action. The name shares its stem with empirical evidence, reflecting the idea of practical experience.

Empiric antimicrobial therapy is directed against the anticipated and most likely cause of an infectious disease. It is used when antimicrobials are given before the specific bacterium or fungus causing the infection is known; once the pathogen is identified, the treatment given is called directed therapy. Fighting an infection sooner rather than later is important to minimize morbidity, risk, and complications in serious infections such as sepsis and suspected bacterial meningitis.

Empiric antimicrobial therapy

Empiric antimicrobial therapy is typically broad-spectrum, in that it targets a wide range of possible pathogens: multiple Gram-positive and Gram-negative bacteria, or diverse fungi or parasites. When more information is known (as from a blood culture), treatment may be changed to a narrow-spectrum antimicrobial that more specifically targets the bacterium or fungus known to be causing disease. Empiric antimicrobial therapy is a fairly sophisticated process that weighs data such as a person's age, immune status, comorbidities, the likelihood of a given microbial etiology, the pre-test probability of antimicrobial resistance, and the risk of poor outcomes, among other factors.
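The notion of pre-test probability can be made concrete with the likelihood-ratio form of Bayes' theorem, the standard relation used in diagnostic reasoning. This is only an illustrative sketch: the relation itself is general, but the numbers in the example below are hypothetical rather than drawn from this article's sources.

\[
\text{pre-test odds} = \frac{p}{1-p}, \qquad
\text{post-test odds} = \text{pre-test odds} \times LR, \qquad
p' = \frac{\text{post-test odds}}{1 + \text{post-test odds}}
\]

For instance, if local surveillance data suggested a 20% prevalence of resistance to a candidate antibiotic (p = 0.20, so pre-test odds of 0.25), and a hypothetical rapid resistance test had a positive likelihood ratio LR = 9, a positive result would give post-test odds of 2.25, i.e. a post-test probability p' of about 69%, which might shift the empiric choice toward a different agent.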

Specimens are collected from affected body sites, preferably before antibiotics are given. For example, a person in an intensive care unit may develop a hospital-acquired pneumonia, whose causal bacteria, and their sensitivity to antibiotics, may differ from those of community-acquired pneumonia.[2] Treatment is generally started empirically, on the basis of surveillance data about the common local bacterial causes. This first treatment, based on statistical information about former patients and aimed at a large group of potentially involved microbes, is called empiric treatment.[3]

Prescribing antibiotics empirically is advantageous when a causative pathogen is likely albeit unknown and when diagnostic tests would not influence treatment. In that case, there may be little if any perceived benefit in costly and possibly inconclusive tests that would only delay treatment with the same antibiotics. The empirical use of broad-spectrum antibiotics increases, by selection, the prevalence of bacteria resistant to several antibiotics. However, the delay and expense of definitive species identification in every single clinical case are not affordable, so some degree of trade-off is accepted on the principle that the benefits outweigh the risks.

Earlier senses of the term

Another now-dated sense of the term empiric therapy involves quackery, and empiric as a noun has been used as a synonym of quack.[4]

This sense applies when the clinician's guessing departs so far from science that the standard of care is not upheld. Whereas prescribing a broad-spectrum antibiotic to fight a clinically apparent infection as early as possible is entirely prudent and scientific despite the absence of confirmatory cultures, prescribing magic rituals or pseudoscientific schemes is not.

The fact that "acting on practical experience in the absence of theory or complete knowledge" can take both legitimate and illegitimate forms long predates science. In ancient Greece, when medical science as we now know it did not yet exist, all medicine was unscientific and traditional; theories of etiology, pathogenetic mechanism, and therapeutic mechanism of action were based on religious, mythologic, or cosmologic ideas. Humorism, for example, could dictate that bloodletting was indicated for a certain disorder in order to rebalance a supposed excess of water. But because such theories involved a great deal of fanciful notions, their safety and efficacy could range from slight to negative. In the bloodletting example, the fact that fluid balance is a legitimate physiologic concern did not mean that the then-state-of-the-art "understanding" of causation was well founded overall. In this environment, where mainstream medicine was unscientific, a school of thought arose in which theory was ignored and only practical results were considered. This was the original introduction of empiricism into medicine, long before medical science would greatly extend it.

However, by the late 19th and early 20th centuries, as biological and medical science developed, the situation had reversed: because the state of the art in medicine was now scientific medicine, physicians who ignored all etiologic theory in favor of only their own experience were increasingly quackish, even though in the era of religion-based or mythology-based medicine (the era of medicine men) they might have been, viewed through today's hindsight, admirably rational and in fact protoscientific. Thus, as science became the norm, unscientific and pseudoscientific approaches qualified as quackery.

In the 21st century, the next phase of differentiation on this topic is underway. All clinical practice based on medical science is, by that fact, based on empirical evidence to a large degree, but efforts are underway to ensure that all of the science on any given medical topic is consistently applied in the clinic, with its best portions graded and weighted more heavily. This is the latest cycle in which personal experience (even expert opinion with a scientific basis) is not considered good enough by itself. Thus, in evidence-based medicine, the goal is that every clinician will make decisions for every patient with total mastery and critical analysis of the entire scientific literature at their fingertips. This goal is formidably vast to implement operationally (because no one person can master all extant biomedical knowledge through individual education[5]), but development of health information technology, such as expert systems and other artificial intelligence in medicine, is underway in pursuit of it.[5]

References

  1. ^ Dorland's Illustrated Medical Dictionary. Elsevier.
  2. ^ Peyrani P, Mandell L, Torres A, Tillotson GS (February 2019). "The burden of community-acquired bacterial pneumonia in the era of antibiotic resistance". Expert Review of Respiratory Medicine. 13 (2): 139–152. doi:10.1080/17476348.2019.1562339. PMID 30596308. S2CID 58640721.
  3. ^ Burnett 2005, pp. 167–171.
  4. ^ Merriam-Webster's Collegiate Dictionary. Merriam-Webster.
  5. ^ a b Khosla, Vinod (2012-12-04). "Technology will replace 80% of what doctors do". Fortune. Quote: "Most doctors couldn't possibly read and digest all of the latest 5,000 research articles on heart disease. And, most of the average doctor's medical knowledge is from when they were in medical school, while cognitive limitations prevent them from remembering the 10,000+ diseases humans can get."

Works cited