Bonini's paradox

From Wikipedia, the free encyclopedia

Bonini's paradox, named after Stanford business professor Charles Bonini, describes the difficulty of constructing models or simulations that fully capture the workings of complex systems (such as the human brain).[1]

Statements

In modern discourse, the paradox was articulated by John M. Dutton and William H. Starbuck:[2] "As a model of a complex system becomes more complete, it becomes less understandable. Alternatively, as a model grows more realistic, it also becomes just as difficult to understand as the real-world processes it represents" (Computer Simulation of Human Behaviour, 1971).

Researchers may invoke this paradox to explain why complete models of the human brain and its thinking processes have not been created, and why they are likely to remain elusive for years to come.

The same tension was observed earlier by Paul Valéry: "Everything simple is false. Everything which is complex is unusable" (Notre destin et les lettres, 1937).

Richard Levins discussed the same trade-off in his classic essay "The Strategy of Model Building in Population Biology", arguing that complex models have "too many parameters to measure, leading to analytically insoluble equations that would exceed the capacity of our computers, but the results would have no meaning for us even if they could be solved" (see Orzack and Sober, 1993; Odenbaugh, 2006).

Related issues

Bonini's paradox can be seen as a case of the map–territory relation: simpler maps are less accurate, though more useful, representations of the territory. An extreme form appears in the fictional stories Sylvie and Bruno Concluded and On Exactitude in Science, which imagine a map at a scale of 1:1 (the same size as the territory), perfectly precise but unusable, illustrating one extreme of the paradox. Isaac Asimov's fictional science of "psychohistory" in his Foundation series faces the same dilemma; Asimov even had one of his psychohistorians discuss the paradox.

References

  1. ^ Charles P. Bonini (1963). Simulation of Information and Decision Systems in the Firm. Englewood Cliffs, NJ: Prentice-Hall.
  2. ^ W. H. Starbuck (1976). "Organizations and their environments". In M. D. Dunnette (ed.), Handbook of Industrial and Organizational Psychology. Chicago: Rand, pp. 1069–1123.