Modularity of mind

Modularity of mind is the notion that a mind may, at least in part, be composed of innate neural structures or mental modules which have distinct, established, and evolutionarily developed functions. However, different definitions of "module" have been proposed by different authors. According to Jerry Fodor, the author of Modularity of Mind, a system can be considered 'modular' if it exhibits, at least to some degree, a characteristic cluster of properties such as domain specificity and informational encapsulation.[1] One example of modularity in the mind is binding. When one perceives an object, one takes in not only its individual features but also the way those features, which can operate in sync or independently, are integrated into a whole. Instead of just seeing red, round, plastic, and moving, the subject may experience a rolling red ball.[2] Binding may suggest that the mind is modular because it takes multiple cognitive processes, working together, to perceive one thing.

Early investigations

Historically, questions regarding the functional architecture of the mind have been divided between two theories of the nature of the faculties. The first can be characterized as a horizontal view because it treats mental processes as interactions between faculties such as memory, imagination, judgement, and perception, which are not domain specific (e.g., a judgement remains a judgement whether it refers to a perceptual experience or to the conceptualization/comprehension process). The second can be characterized as a vertical view because it claims that the mental faculties are differentiated on the basis of domain specificity, are genetically determined, are associated with distinct neurological structures, and are computationally autonomous.[3]

The vertical view goes back to the 19th-century movement called phrenology and its founder Franz Joseph Gall. Gall claimed that the individual mental faculties could be associated precisely, in a one-to-one correspondence, with specific physical areas of the brain.[4] For example, someone's level of intelligence could be literally "read off" from the size of a particular bump on the posterior parietal lobe. Phrenology was scientifically debunked by Pierre Flourens in the 19th century. He created lesions by destroying parts of pigeons' and dogs' brains and studied the resulting dysfunction. He concluded that while some functions are localized in the brain, it also works as a unit and is not as localized as the phrenologists had thought.[4] In the late 19th and early 20th centuries, Edward Bradford Titchener studied the modules of the mind through introspection. He tried to determine the original, raw perceptual experiences of his subjects. For example, if he wanted his subjects to perceive an apple, they would have to describe its spatial characteristics and the different hues they saw without mentioning the apple itself.[4]

Fodor's Modularity of Mind

In the 1980s, however, Jerry Fodor revived the idea of the modularity of mind, although without the notion of precise physical localizability. Drawing from Noam Chomsky's idea of the language acquisition device and other work in linguistics as well as from the philosophy of mind and the implications of optical illusions, he became a major proponent of the idea with the 1983 publication of Modularity of Mind.[3]

According to Fodor, a module falls somewhere between the behaviorist and cognitivist views of lower-level processes.

Behaviorists tried to replace the mind with reflexes, which are, according to Fodor, encapsulated (cognitively impenetrable or unaffected by other cognitive domains) and non-inferential (straight pathways with no information added). Low-level processes are unlike reflexes in that they can be inferential. This can be demonstrated by the poverty of the stimulus argument, which posits that children do not learn language from their environment alone, but are innately equipped with low-level processes that help them seek out and learn language. The proximate stimulus, that which is initially received by the brain (such as the 2D image received by the retina), cannot account for the resulting output (for example, our 3D perception of the world), thus necessitating some form of computation.[5]
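
To see why the proximate stimulus underdetermines the output, consider a worked example based on a simple pinhole (perspective) projection model; this is an illustrative sketch, not an example given by Fodor. With focal length f, a scene point (X, Y, Z) projects to the image point

    \[ (x, y) = \left( \frac{fX}{Z},\ \frac{fY}{Z} \right), \]

and every rescaled point (\lambda X, \lambda Y, \lambda Z) with \lambda > 0 projects to the same (x, y). Infinitely many three-dimensional scenes are therefore consistent with a single two-dimensional retinal image, so arriving at a determinate 3D percept requires inference from additional, built-in constraints rather than from the stimulus alone.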

In contrast, cognitivists saw lower-level processes as continuous with higher-level processes, being inferential and cognitively penetrable (influenced by other cognitive domains, such as beliefs). Cognitive penetrability has been shown to fail in some cases, such as the Müller-Lyer illusion, which persists even when a person is aware that it is an illusion.[6] This is taken to indicate that other domains, including one's beliefs, cannot influence such processes.

Fodor arrives at the conclusion that such processes are inferential like higher-order processes and encapsulated in the same sense as reflexes.

Although he argued for the modularity of "lower level" cognitive processes in Modularity of Mind, he also argued that higher-level cognitive processes are not modular since they have dissimilar properties. His later book The Mind Doesn't Work That Way (2000), a reaction to Steven Pinker's How the Mind Works, is devoted to this subject.

Fodor (1983) states that modular systems must—at least to "some interesting extent"—fulfill certain properties:

  1. Domain specificity: modules only operate on certain kinds of inputs—they are specialised
  2. Obligatory firing: modules process in a mandatory manner
  3. Limited accessibility: what central processing can access from input system representations is limited
  4. Fast speed: probably due to the fact that they are encapsulated (thereby needing only to consult a restricted database) and mandatory (time need not be wasted in determining whether or not to process incoming input)
  5. Informational encapsulation: modules need not refer to other psychological systems in order to operate
  6. Shallow outputs: the output of modules is very simple
  7. Specific breakdown patterns
  8. Characteristic ontogeny: there is a regularity of development
  9. Fixed neural architecture.

Pylyshyn (1999) has argued that while these properties tend to occur together in modules, one—information encapsulation—stands out as the real signature of a module; that is, the encapsulation of the processes inside the module from both cognitive influence and cognitive access.[7] One example is that conscious awareness that the Müller-Lyer illusion is an illusion does not correct the visual processing that produces it.[8]

Evolutionary psychology and massive modularity

The definition of module has caused confusion and dispute. In Fodor's view, modules can be found in peripheral processing, such as low-level visual processing, but not in central processing. He later narrowed the essential features of a module to two: domain specificity and information encapsulation. According to Frankenhuis and Ploeger, domain specificity means that "a given cognitive mechanism accepts, or is specialized to operate on, only a specific class of information".[8] Information encapsulation means that information processing in the module cannot be affected by information in the rest of the brain. One example is that the effects of an optical illusion, created by low-level processes, persist despite high-level awareness that the percept is illusory.[8]

Other perspectives on modularity come from evolutionary psychology. Evolutionary psychologists propose that the mind is made up of genetically influenced and domain-specific[9] mental algorithms or computational modules, designed to solve specific evolutionary problems of the past.[10] In contrast to Fodor, they hold that modules are used for central processing as well; this theory is sometimes referred to as massive modularity.[8] Leda Cosmides and John Tooby claimed that modules are units of mental processing that evolved in response to selection pressures. To them, each module is a complex computer that innately processes a distinct part of the world, handling tasks such as recognizing faces, reading human emotions, and solving problems.[11] On this view, much modern human psychological activity is rooted in adaptations that occurred earlier in human evolution, when natural selection was forming the modern human species.

A 2010 review by evolutionary psychologists Confer et al. suggested that domain-general theories, such as those appealing to "rationality", face several problems:

  1. Evolutionary theories that posit numerous domain-specific adaptations have produced testable predictions that have been empirically confirmed; the theory of domain-general rational thought has produced no such predictions or confirmations.
  2. The rapidity of responses such as jealousy due to infidelity indicates a domain-specific dedicated module rather than a general, deliberate, rational calculation of consequences.
  3. Reactions may occur instinctively (consistent with innate knowledge) even when a person has not learned the relevant knowledge. For example, in the ancestral environment it is unlikely that males could have learned during development that infidelity (usually kept secret) causes paternal uncertainty, since that would require observing the phenotypes of children born many months later and drawing a statistical conclusion from their dissimilarity to the cuckolded fathers.[12]

With respect to general-purpose problem solvers, Barkow, Cosmides, and Tooby (1992) suggested in The Adapted Mind: Evolutionary Psychology and The Generation of Culture that a purely general problem-solving mechanism is impossible to build due to the frame problem. Clune et al. (2013) argued that computer simulations of the evolution of neural networks suggest that modularity evolves when connections carry a cost, because modular networks need fewer connections than non-modular ones.[13] A minimal sketch of this kind of fitness function is given below.
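
The connection-cost selection pressure described by Clune et al. can be illustrated with a toy fitness function. The following is a minimal sketch in Python; the class, parameter names, and the weighting constant are assumptions made for illustration, not the authors' actual implementation.

    from dataclasses import dataclass, field
    from typing import List, Tuple

    # Toy sketch of a connection-cost objective in the spirit of Clune,
    # Mouret & Lipson (2013); names and the 0.25 weight are illustrative.

    @dataclass
    class Network:
        edges: List[Tuple[int, int]] = field(default_factory=list)  # (source, target) connections

    def fitness(net: Network, task_performance: float, cost_weight: float = 0.25) -> float:
        # Reward solving the task, penalise every connection the network uses.
        # Penalising wiring favours sparse, decomposable (modular) architectures.
        return task_performance - cost_weight * len(net.edges)

    # Two candidates with equal task performance: the sparser one scores higher,
    # so selection under this objective tends to retain modular wiring.
    dense = Network(edges=[(i, j) for i in range(4) for j in range(4, 8)])  # 16 connections
    sparse = Network(edges=[(0, 4), (1, 4), (2, 5), (3, 5)])                # 4 connections
    assert fitness(sparse, 1.0) > fitness(dense, 1.0)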

Several groups of critics, including psychologists working within evolutionary frameworks,[14] argue that the massively modular theory of mind does little to explain adaptive psychological traits. Proponents of other models of the mind argue that the computational theory of mind is no better at explaining human behavior than a theory in which the mind is entirely a product of the environment. Even within evolutionary psychology there is discussion about the degree of modularity, whether the mind comprises a few generalist modules or many highly specific ones.[14][15] Other critics suggest that there is little empirical support in favor of the domain-specific theory beyond performance on the Wason selection task, a task critics state is too limited in scope to test all relevant aspects of reasoning.[16][17] Moreover, critics argue that Cosmides and Tooby's conclusions contain several inferential errors and that the authors use untested evolutionary assumptions to eliminate rival reasoning theories.[16][18]

Criticisms of the notion of modular minds from genetics hold that specifying an innately modular mind would require too much genetic information. The number of mutations per generation limits the amount of genetic material that can carry functional information: only a small part of the human genome can be functional in an information-carrying way if an impossibly high rate of lethal mutations is to be avoided, and selection against lethal mutations would have stopped and reversed any increase in the amount of functional DNA long before it reached the amount required for modularity of mind. It is argued that proponents of modularity of mind conflate this argument with the straw man of assuming no function in any non-protein-coding DNA, pointing to discoveries that some non-coding DNA has regulatory functions. The actual argument acknowledges that parts of non-coding DNA can have functions, but places a bound on the total amount of information-bearing genetic material regardless of whether it codes for proteins; this is consistent with the discovery that regulatory functions extend only to parts of non-coding DNA and cannot be generalized to all DNA that does not code for proteins. The maximum amount of information-carrying heredity is argued to be too small to specify modular brains.[19]
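
The quantitative core of this criticism is the standard mutational-load argument, which can be sketched as follows (the symbols and the form of the bound are supplied here for exposition and are not taken from the cited source). If \mu is the deleterious mutation rate per nucleotide per generation, G the genome size, and f the fraction of the genome that carries fitness-relevant information, each offspring receives roughly

    \[ U \approx \mu \, G \, f \]

new deleterious mutations, and at mutation–selection balance the population's mean fitness is approximately \( \bar{w} \approx e^{-U} \). Because \mu G is fixed by observed mutation rates, keeping \bar{w} compatible with sustainable reproduction caps f, and hence caps how much innately specified neural structure the genome could encode.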

Wallace (2010) observes that the evolutionary psychologists' definition of "mind" has been heavily influenced by cognitivism and/or information-processing definitions of the mind.[20] Critics point out that these assumptions underlying evolutionary psychologists' hypotheses are controversial and have been contested by some psychologists, philosophers, and neuroscientists. For example, Jaak Panksepp, an affective neuroscientist, points to the "remarkable degree of neocortical plasticity within the human brain, especially during development" and states that "the developmental interactions among ancient special-purpose circuits and more recent general-purpose brain mechanisms can generate many of the 'modularized' human abilities that evolutionary psychology has entertained."[14]

Philosopher David Buller agrees with the general argument that the human mind has evolved over time but disagrees with the specific claims evolutionary psychologists make. He has argued that the contention that the mind consists of thousands of modules, including sexually dimorphic jealousy and parental-investment modules, is unsupported by the available empirical evidence.[21] He has suggested that these "modules" result from the brain's developmental plasticity and are adaptive responses to local conditions rather than to past evolutionary environments.[22] However, Buller has also stated that even if massive modularity is false, this does not necessarily have broad implications for evolutionary psychology: evolution may create innate motives even without innate knowledge.[23]

In contrast to modular mental structure, some theories posit domain-general processing, in which mental activity is distributed across the brain and cannot be decomposed, even abstractly, into independent units. A staunch defender of this view is William Uttal, who argues in The New Phrenology (2003) that there are serious philosophical, theoretical, and methodological problems with the entire enterprise of trying to localise cognitive processes in the brain.[24] Part of this argument is that a successful taxonomy of mental processes has yet to be developed.

Merlin Donald argues that over evolutionary time the mind has gained adaptive advantage from being a general problem solver.[25] The mind, as described by Donald, includes module-like "central" mechanisms, in addition to more recently evolved "domain-general" mechanisms.

References

  1. ^ Robbins, Philip (August 21, 2017). "Modularity of Mind". Stanford Encyclopedia of Philosophy.
  2. ^ Goldstein, E. Bruce (17 June 2014). Cognitive Psychology. p. 109. ISBN 978-1-285-76388-0.
  3. ^ a b Fodor, Jerry A. (1983). Modularity of Mind: An Essay on Faculty Psychology. Cambridge, Massachusetts: MIT Press. ISBN 0-262-56025-9
  4. ^ a b c Hergenhahn, B. R. (2009). An Introduction to the History of Psychology (6th ed.). Australia: Wadsworth Cengage Learning. ISBN 978-0-495-50621-8. OCLC 234363300.
  5. ^ Laurence, Stephen (2001). "The Poverty of the Stimulus Argument". The British Journal for the Philosophy of Science. 52 (2): 217–276. doi:10.1093/bjps/52.2.217.
  6. ^ Donaldson, J. (2017). "Müller-Lyer". The Illusions Index.
  7. ^ Pylyshyn, Z.W. (1999). "Is vision continuous with cognition? The case for cognitive impenetrability of visual perception" (PDF). Behavioral and Brain Sciences. 22 (3): 341–423. doi:10.1017/S0140525X99002022. PMID 11301517. S2CID 9482993. Archived from the original (PDF) on 2008-05-11.
  8. ^ a b c d Frankenhuis, W. E.; Ploeger, A. (2007). "Evolutionary Psychology Versus Fodor: Arguments for and Against the Massive Modularity Hypothesis". Philosophical Psychology. 20 (6): 687. doi:10.1080/09515080701665904. S2CID 96445244.
  9. ^ Cosmides, L. & Tooby, J. (1994). Origins of Domain Specificity: The Evolution of Functional Organization. In L.A. Hirschfeld and S.A. Gelmen, eds., Mapping the Mind: Domain Specificity in Cognition and Culture. Cambridge: Cambridge University Press. Reprinted in R. Cummins and D.D. Cummins, eds., Minds, Brains, and Computers. Oxford: Blackwell, 2000, 523–543.
  10. ^ Cosmides, L., & Tooby, J. (1992). Cognitive Adaptations for Social Exchange. In Barkow, Cosmides, and Tooby 1992, 163–228.
  11. ^ Samuels, Richard (1998). "Evolutionary Psychology and the Massive Modularity Hypothesis". The British Journal for the Philosophy of Science. 49 (4): 575–602. doi:10.1093/bjps/49.4.575. JSTOR 688132.
  12. ^ Confer, J. C.; Easton, J. A.; Fleischman, D. S.; Goetz, C. D.; Lewis, D. M. G.; Perilloux, C.; Buss, D. M. (2010). "Evolutionary psychology: Controversies, questions, prospects, and limitations" (PDF). American Psychologist. 65 (2): 110–126. CiteSeerX 10.1.1.601.8691. doi:10.1037/a0018413. PMID 20141266.
  13. ^ Clune, Jeff; Mouret, Jean-Baptiste; Lipson, Hod (2013). "The evolutionary origins of modularity". Proceedings of the Royal Society B: Biological Sciences. 280 (1755): 20122863. arXiv:1207.2743. doi:10.1098/rspb.2012.2863. PMC 3574393. PMID 23363632.
  14. ^ a b c Panksepp, J. & Panksepp, J. (2000). The Seven Sins of Evolutionary Psychology. Evolution and Cognition, 6:2, 108–131.
  15. ^ Buller, David J. and Valerie Gray Hardcastle (2005). "Modularity", Chapter 4 in Buller, David J., Adapting Minds: Evolutionary Psychology and the Persistent Quest for Human Nature. The MIT Press. pp. 127–201.
  16. ^ a b Davies, Paul Sheldon; Fetzer, James H.; Foster, Thomas R. (1995). "Logical reasoning and domain specificity". Biology and Philosophy. 10 (1): 1–37. doi:10.1007/BF00851985. S2CID 83429932.
  17. ^ O'Brien, David; Manfrinati, Angela (2010). "The Mental Logic Theory of Conditional Propositions". In Oaksford, Mike; Chater, Nick (eds.). Cognition and Conditionals: Probability and Logic in Human Thinking. New York: Oxford University Press. pp. 39–54. ISBN 978-0-19-923329-8.
  18. ^ Lloyd, Elizabeth A. (1999). "Evolutionary Psychology: The Burdens of Proof" (PDF). Biology and Philosophy. 14 (2): 211–233. doi:10.1023/A:1006638501739. S2CID 1929648. Retrieved October 6, 2014.
  19. ^ Peters, Brad M. (2013). "Evolutionary Psychology: Neglecting Neurobiology in Defining the Mind" (PDF). Theory & Psychology. doi:10.1177/0959354313480269. Available at http://modernpsychologist.ca/wp-content/uploads/2011/12/EP-Neglecting-Neurobiology-in-Defining-the-Mind1.pdf
  20. ^ Wallace, B. (2010). Getting Darwin Wrong: Why Evolutionary Psychology Won't Work. Exeter, UK: Imprint Academic.
  21. ^ Buller, David J. (2005). "Evolutionary psychology: the emperor's new paradigm" (PDF). Trends in Cognitive Sciences. 9 (6): 277–283. doi:10.1016/j.tics.2005.04.003. hdl:10843/13182. PMID 15925806. S2CID 6901180. Retrieved March 23, 2013.
  22. ^ Buller, David J.; Hardcastle, Valerie (2000). "Evolutionary Psychology, Meet Developmental Neurobiology: Against Promiscuous Modularity" (PDF). Brain and Mind. 1 (3): 307–325. doi:10.1023/A:1011573226794. S2CID 5664009. Retrieved March 23, 2013.
  23. ^ Buller, David J. (2005). "Get Over: Massive Modularity" (PDF). Biology & Philosophy. 20 (4): 881–891. doi:10.1007/s10539-004-1602-3. S2CID 34306536. Archived from the original (PDF) on March 17, 2015. Retrieved March 23, 2013.
  24. ^ Uttal, William R. (2003). The New Phrenology: The Limits of Localizing Cognitive Processes in the Brain. Cambridge, Massachusetts: MIT Press.
  25. ^ Donald, Merlin (2001). A Mind So Rare: The Evolution of Human Consciousness. New York: W. W. Norton.

Further reading

  • Barrett, H.C.; Kurzban, R. (2006). "Modularity in cognition: Framing the debate" (PDF). Psychological Review. 113 (3): 628–647. doi:10.1037/0033-295X.113.3.628. PMID 16802884.
  • Pylyshyn, Z.W. (1984). Computation and cognition: Toward a foundation for cognitive science. Cambridge, Massachusetts: MIT Press (Also available through CogNet).
  • Griffin, Donald R. (2001). Animal Minds: Beyond Cognition to Consciousness. University of Chicago Press. ISBN 0226308650.
  • Shallice, Tim, & Cooper, Rick. (2011). The Organisation of Mind. Oxford: Oxford University Press. Chapter 3: Bridging the Theoretical Gap: from the Brain to Cognitive Theory (pp. 67–107).
