Relevance paradox

From Wikipedia, the free encyclopedia

The relevance paradox describes a failure mode in gathering information for a decision: by eliminating information perceived as distracting or unnecessary, and thus detrimental to making an optimal decision, decision makers can inadvertently exclude information that is actually crucial.[1]

Definition

In many cases in which action or a decision is required, it is obvious what relevant information may be lacking: a military attack may lack maps, so reconnaissance is undertaken; an engineering project may lack ground-condition details, which will be ascertained; a public health program will require a survey of which illnesses are prevalent; and so on.

However, in many significant instances across a wide range of areas, decision makers remain unaware that readily available information is relevant, because they lack the information that would make its relevance clear. As a result, they do not attempt to look for it.[2] Such decision makers seek only the information and advice they believe to be the minimum required, rather than what they actually need to meet their goals fully.

An analogy is a near-sighted person who is unaware of their condition: they would perceive no benefit in the glasses they need until they tried wearing them. Such a situation can be resolved by a third party who, aware of the relevance, recommends an eye test.

The relevance paradox has been cited as a cause of the increase in diseases in developing countries even as more money is spent on them: "Relevance paradoxes occur because of implementation of projects without awareness of the social or individual tacit knowledge within a target community. The understanding of the individual and the social tacit knowledge in a given community, which is a function of knowledge emergence, is the foundation of effectiveness in leadership practice."[3]

Examples

From the 1950s onwards, civil engineers unwittingly caused a massive increase in the debilitating water-borne infection schistosomiasis (bilharzia) through irrigation schemes that lacked simple, low-cost countermeasures, simply because the engineers had no knowledge of those countermeasures. Yet the United Nations had already published guidelines explaining cheap countermeasures and how they could be built into the design of irrigation schemes: matters as simple as keeping flow velocities above a certain level to prevent the disease vector, a water snail, from attaching to the conduits. The engineers believed they needed to know only about engineering issues such as concrete and water flows, not how to control flow velocities to keep the disease-carrying snail species from multiplying, so they failed to seek this information.[4]

The relevance paradox can, and usually does, apply to all professional groups and individuals in numerous ways.[5] While there are many examples of wilful ignorance, there are also many cases in which people do not look outside the paradigms they operate within and thus fail to see the long-term consequences.

Avoidance

The notions of Information Routing Groups (IRGs) and Interlock research were designed to counter this paradox by promoting lateral communication and the flow of tacit knowledge, which in many cases consists of the unconscious and unwritten knowledge of what is relevant.

A related point is that, in many cases, despite good library indexing systems and search engines, the way a specific piece of knowledge is described is not obvious unless one already has that knowledge.

References

  1. ^ The IRG Solution, Chapter 5, p. 87. http://www.claverton-energy.com/?dl_id=339
  2. ^ "The Importance of Knowing the Right People". The Guardian. 1980-03-20.
  3. ^ http://ustawi.com/main/page_about_ustawi.html
  4. ^ Charnock, Anne (1980-08-07). "Taking Bilharziasis out of the irrigation equation". New Civil Engineer 1 (8).
  5. ^ Andrews, David (1984). The IRG Solution - Hierarchical Incompetence and how to overcome it. London: Souvenir Press. pp. 200–220. ISBN 0-285-62662-0.