Consciousness

From Wikipedia, the free encyclopedia

Revision as of 17:27, 15 September 2011

Representation of consciousness from the seventeenth century.

Consciousness is a term that refers to a variety of aspects of the relationship between the mind and the world with which it interacts.[1] It has been defined as: subjectivity; awareness; the ability to experience feelings; wakefulness; having a sense of selfhood; or the executive control system of the mind.[2] Despite the difficulty of definition, many philosophers believe that there is a broadly shared underlying intuition about what consciousness is.[3] As Max Velmans and Susan Schneider wrote in The Blackwell Companion to Consciousness: "Anything that we are aware of at a given moment forms part of our consciousness, making conscious experience at once the most familiar and most mysterious aspect of our lives."[4]

Philosophers since the time of Descartes and Locke have struggled to comprehend the nature of consciousness and pin down its essential properties. Issues of concern in the philosophy of consciousness include whether the concept is fundamentally valid; whether consciousness can ever be explained mechanistically; whether non-human consciousness exists and if so how it can be recognized; how consciousness relates to language; and whether it may ever be possible for computers to achieve a conscious state. Perhaps the thorniest issue is whether consciousness can be understood in a way that does not require a dualistic distinction between mental and physical entities.

At one time consciousness was viewed with skepticism by many scientists, but in recent years it has been a significant topic of research in psychology and neuroscience. The primary focus is on understanding what it means biologically and psychologically for information to be present in consciousness—that is, on determining the neural and psychological correlates of consciousness. The majority of experimental studies assess consciousness by asking human subjects for a verbal report of their experiences (e.g., "tell me if you notice anything when I do this"). Issues of interest include phenomena such as subliminal perception, blindsight, denial of impairment, and altered states of consciousness produced by psychoactive drugs or spiritual or meditative techniques.

In medicine, consciousness is assessed by observing a patient's arousal and responsiveness, and can be seen as a continuum of states ranging from full alertness and comprehension, through disorientation, delirium, loss of meaningful communication, and finally loss of movement in response to painful stimuli.[5] Issues of practical concern include how the presence of consciousness can be assessed in severely ill, comatose, or anesthetized people, and how to treat conditions in which consciousness is impaired or disrupted.[6]

Etymology and early history

John Locke

The origin of the modern concept of consciousness is often attributed to John Locke's Essay Concerning Human Understanding, published in 1690.[7] Locke defined consciousness as “the perception of what passes in a man’s own mind.”[8] His essay had much influence on the 18th century view of consciousness, and his definition appeared in Samuel Johnson's celebrated Dictionary (1755).

The earliest English language uses of "conscious" and "consciousness" date back, however, to the 1500s. The English word "conscious" originally derived from the Latin conscius (con- "together" + scire "to know"), but the Latin word did not have the same meaning as our word — it meant knowing with, in other words having joint or common knowledge with another.[9] There were, however, many occurrences in Latin writings of the phrase conscius sibi, which translates literally as "knowing with oneself", or in other words sharing knowledge with oneself about something. This phrase had the figurative meaning of knowing that one knows, as the modern English word "conscious" does. In its earliest uses in the 1500s, the English word "conscious" retained the meaning of the Latin conscius. For example, Thomas Hobbes wrote in Leviathan: "Where two, or more men, know of one and the same fact, they are said to be Conscious of it one to another." The Latin phrase conscius sibi, whose meaning was more closely related to the current concept of consciousness, was rendered in English as "conscious to oneself" or "conscious unto oneself". For example, Archbishop Ussher wrote in 1613 of "being so conscious unto myself of my great weakness".[10] Locke's definition from 1690 illustrates that a gradual shift in meaning had taken place.

A related word was conscientia, which primarily means moral conscience. In the literal sense, "conscientia" means knowledge-with, that is, shared knowledge. The word first appears in Latin juridical texts by writers such as Cicero.[11] Here, conscientia is the knowledge that a witness has of the deed of someone else.[12] René Descartes (1596–1650) is generally taken to be the first philosopher to use "conscientia" in a way that does not fit this traditional meaning.[13] Descartes used "conscientia" the way modern speakers would use "conscience." In Search after Truth he says "conscience or internal testimony" (conscientia vel interno testimonio).[14]

In philosophy

The philosophy of mind has given rise to many stances regarding consciousness. Any attempt to impose an organization on them is bound to be somewhat arbitrary. Stuart Sutherland exemplified the difficulty in the entry he wrote for the 1989 version of the Macmillan Dictionary of Psychology:

Consciousness—The having of perceptions, thoughts, and feelings; awareness. The term is impossible to define except in terms that are unintelligible without a grasp of what consciousness means. Many fall into the trap of equating consciousness with self-consciousness—to be conscious it is only necessary to be aware of the external world. Consciousness is a fascinating but elusive phenomenon: it is impossible to specify what it is, what it does, or why it has evolved. Nothing worth reading has been written on it.[15]

Most writers on the philosophy of consciousness have been concerned to defend a particular point of view, and have organized their material accordingly. For surveys, perhaps the most common approach is to follow a historical path by associating stances with the philosophers who are most strongly associated with them, for example Descartes, Locke, Kant, etc. The main alternative, followed in the present article, is to organize philosophical stances according to the answers they give to a set of basic questions about the nature and status of consciousness.

Is consciousness a valid concept?

A majority of philosophers have felt that the word consciousness names a genuine entity, but some who belong to the physicalist and behaviorist schools have not been convinced. Many scientists have also been skeptical. The most compelling argument for the existence of consciousness is that the vast majority of mankind have an overwhelming intuition that there truly is such a thing.[16] Skeptics argue that this intuition, in spite of its compelling quality, is false, either because the concept of consciousness is intrinsically incoherent, or because our intuitions about it are based in illusions. Gilbert Ryle, for example, argued that traditional understanding of consciousness depends on a Cartesian dualist outlook that improperly distinguishes between mind and body, or between mind and world. He proposed that we speak not of minds, bodies, and the world, but of individuals, or persons, acting in the world. Thus, by speaking of 'consciousness' we end up misleading ourselves by thinking that there is any such thing as consciousness separate from behavioral and linguistic understandings.[17]

Many philosophers and scientists have been unhappy about the difficulty of producing a definition that does not involve circularity or fuzziness. The neuroscientist Antonio Damasio, for example, calls consciousness "the feeling of what happens", and defines it as "an organism's awareness of its own self and its surroundings".([18], p. 4) These formulations seem intuitively reasonable, but they are difficult to apply to specific situations.

Is it a single thing?

Many philosophers have argued that consciousness is a unitary concept that is understood intuitively by the majority of people in spite of the difficulty in defining it. Others, though, have argued that the level of disagreement about the meaning of the word indicates that it either means different things to different people, or else is an umbrella term encompassing a variety of distinct meanings with no simple element in common.

Ned Block proposed a distinction between two types of consciousness that he called phenomenal (P-consciousness) and access (A-consciousness).[19] P-consciousness, according to Block, is simply raw experience: it is moving, colored forms, sounds, sensations, emotions and feelings with our bodies and responses at the center. These experiences, considered independently of any impact on behavior, are called qualia. A-consciousness, on the other hand, is the phenomenon whereby information in our minds is accessible for verbal report, reasoning, and the control of behavior. So, when we perceive, information about what we perceive is access conscious; when we introspect, information about our thoughts is access conscious; when we remember, information about the past is access conscious, and so on. Although some philosophers, such as Daniel Dennett, have disputed the validity of this distinction,[20] others have broadly accepted it. David Chalmers has argued that A-consciousness can in principle be understood in mechanistic terms, but that understanding P-consciousness is much more challenging: he calls this the hard problem of consciousness.[21]

Some philosophers believe that Block's two types of consciousness are not the end of the story. William Lycan, for example, argued in his book Consciousness and Experience that at least eight clearly distinct types of consciousness can be identified (organism consciousness; control consciousness; consciousness of; state/event consciousness; reportability; introspective consciousness; subjective consciousness; self-consciousness) — and that even this list omits several more obscure forms.[22]

How does it relate to the physical world?

Illustration of dualism by René Descartes. Inputs are passed by the sensory organs to the pineal gland and from there to the immaterial spirit.

The first influential philosopher to discuss this question specifically was Descartes, and the answer he gave is known as Cartesian dualism. Descartes proposed that consciousness resides within an immaterial domain he called res cogitans (the realm of thought), in contrast to the domain of material things which he called res extensa (the realm of extension).[23] He suggested that the interaction between these two domains occurs inside the brain, perhaps in a small midline structure called the pineal gland.[24]

Although it is widely accepted that Descartes explained the problem cogently, few later philosophers have been happy with his solution, and his ideas about the pineal gland have especially been ridiculed. Alternative solutions, however, have been very diverse. They can be divided broadly into two categories: dualist solutions that maintain Descartes's rigid distinction between the realm of consciousness and the realm of matter but give different answers for how the two realms relate to each other; and monist solutions that maintain that there is really only one realm of being, of which consciousness and matter are both aspects. Each of these categories itself contains numerous variants. The two main types of dualism are substance dualism (which holds that the mind is formed of a distinct type of substance not governed by the laws of physics) and property dualism (which holds that the laws of physics are universally valid but cannot be used to explain the mind). The three main types of monism are physicalism (which holds that the mind consists of matter organized in a particular way), idealism (which holds that only thought truly exists and matter is merely an illusion), and neutral monism (which holds that both mind and matter are aspects of a distinct essence that is itself identical to neither of them). There are also, however, a large number of idiosyncratic theories that cannot cleanly be assigned to any of these camps.

Since the dawn of Newtonian science with its vision of simple mechanical principles governing the entire universe, some philosophers have been tempted by the idea that consciousness could be explained in purely physical terms. The first influential writer to propose such an idea explicitly was Julien Offray de La Mettrie, in his book Man a Machine (L'homme machine).[25]

The most influential modern physical theories of consciousness are based on psychology and neuroscience. Theories proposed by neuroscientists such as Gerald Edelman[26] and Antonio Damasio,[18] and by philosophers such as Daniel Dennett,[27] seek to explain access consciousness and phenomenal consciousness in terms of neural events occurring within the brain. Many other neuroscientists, such as Christof Koch,[28] have explored the neural basis of consciousness without attempting to frame all-encompassing global theories. At the same time, computer scientists working in the field of Artificial Intelligence have pursued the goal of creating digital computer programs that can simulate or embody consciousness.

Several physicists have argued that classical physics is intrinsically incapable of explaining the holistic aspects of consciousness, but that quantum theory provides the missing ingredients. Several theorists have therefore proposed quantum mind (QM) theories of consciousness; the most notable theories falling into this category include the Holonomic brain theory of Karl H. Pribram and David Bohm, and the Orch-OR theory formulated by Stuart Hameroff and Roger Penrose. Some of these QM theories offer descriptions of phenomenal consciousness, as well as QM interpretations of access consciousness. None of the quantum mechanical theories has been confirmed by experiment, and many scientists and philosophers consider the arguments for an important role of quantum phenomena to be unconvincing.[29]

Some theorists hold that phenomenal consciousness in particular creates an explanatory gap. Colin McGinn takes the New Mysterian position that it cannot be closed, while David Chalmers, arguing that philosophical zombies are logically possible, criticizes purely physical accounts of mental experience and supports property dualism.

How does it relate to language?

In humans, the clearest visible indication of consciousness is the ability to use language. Medical assessments of consciousness rely heavily on an ability to respond to questions and commands, and in scientific studies of consciousness, the usual criterion for awareness is verbal report (that is, subjects are deemed to be aware if they say that they are). Thus there is a strong connection between consciousness and language at a practical level. Philosophers differ, however, on whether language is essential to consciousness or merely the most powerful tool for assessing it.

Descartes believed that language and consciousness are bound tightly together. He thought that many of the behaviors humans share with other animals could be explained by physical processes such as reflexes, but that language could not be: he took the fact that animals lack language to be an indication that they lack access to res cogitans, the realm of thought. Others have reached similar conclusions, though sometimes for different reasons. Julian Jaynes argued in The Origin of Consciousness in the Breakdown of the Bicameral Mind that for consciousness to arise, language needs to have reached a fairly high level of complexity. Merlin Donald also argued for a critical dependence of consciousness on the ability to use symbols in a sophisticated way.[30]

Those are, however, minority views. If language is essential, then speechless humans (infants, feral children, aphasics, etc.) could not be said to be conscious, a conclusion that the majority of philosophers have resisted. The implication that only humans, and not other animals, are conscious is also widely resisted by theorists such as evolutionary psychologists, as well as by animal rights activists.[28]

Why do people believe that other people are conscious?

Many philosophers consider experience to be the essence of consciousness, and believe that experience can only fully be known from the inside, subjectively. But if consciousness is subjective and not visible from the outside, why do the vast majority of people believe that other people are conscious, but rocks and trees are not?[31] This is the problem of other minds. It is particularly acute for people who believe in the possibility of philosophical zombies, that is, people who think it is possible in principle to have an entity that is physically indistinguishable from a human being and behaves like a human being in every way but nevertheless lacks consciousness.

The most commonly given answer is that we attribute consciousness to other people because we see that they resemble us in appearance and behavior: we reason that if they look like us and act like us, they must be like us in other ways, including having experiences of the sort that we do. There are, however, a variety of problems with that explanation. For one thing, it seems to violate the principle of parsimony, by postulating an invisible entity that is not necessary to explain what we observe. Some philosophers, such as Daniel Dennett in an essay titled The Unimaginable Preposterousness of Zombies, argue that people who give this explanation do not really understand what they are saying. More broadly, philosophers who do not accept the possibility of zombies generally believe that consciousness is reflected in behavior (including verbal behavior), and that we attribute consciousness on the basis of behavior. A more straightforward way of saying this is that we attribute experiences to people because of what they can do, including the fact that they can tell us about their experiences.

How can we know whether non-human animals are conscious?

Am I conscious?

The topic of animal consciousness is beset by a number of difficulties. It poses the problem of other minds in an especially severe form, because animals, lacking language, cannot tell us about their experiences. Also, it is difficult to reason objectively about the question, because a denial that an animal is conscious is often taken to imply that it does not feel, that its life has no value, and that harming it is not morally wrong. Descartes, for example, has sometimes been blamed for mistreatment of animals because he believed that only humans have a non-physical mind. Most people have a strong intuition that some animals, such as dogs, are conscious, while others, such as insects, are not; but the sources of this intuition are not obvious.

Philosophers who consider subjective experience the essence of consciousness also generally believe, as a correlate, that the existence and nature of animal consciousness can never rigorously be known. Thomas Nagel spelled out this point of view in an influential essay titled What Is it Like to Be a Bat?[32] He said that an organism is conscious "if and only if there is something that it is like to be that organism — something it is like for the organism"; and he argued that no matter how much we know about an animal's brain and behavior, we can never really put ourselves into the mind of the animal and experience its world in the way it does itself. Other thinkers, such as Douglas Hofstadter, dismiss this argument as incoherent.[33] Several psychologists and ethologists have argued for the existence of animal consciousness by describing a range of behaviors that appear to show animals holding beliefs about things they cannot directly perceive — Donald Griffin's 2001 book Animal Minds reviews a substantial portion of the evidence.[34]

Could a machine ever be conscious?

"Teddy" from the film A.I. Artificial Intelligence. Could a robot like this ever be conscious?

The idea that consciousness is fundamentally mechanical can be traced back at least to Julien Offray de La Mettrie's Man a Machine, published in 1748.[25] The idea of an artifact made conscious is an ancient theme of mythology, appearing for example in the Greek myth of Pygmalion, who carved a statue that was magically brought to life, and in medieval Jewish stories of the Golem, a magically animated homunculus built of clay. (In many stories the Golem was mindless, but some gave it emotions or thoughts.) However, the possibility of actually constructing a conscious machine was probably first discussed by Ada Lovelace, in a set of notes written in 1842 about the Analytical Engine invented by Charles Babbage, a precursor (never built) to modern electronic computers. Lovelace was essentially dismissive of the idea that a machine such as the Analytical Engine could think in a humanlike way. She wrote:

It is desirable to guard against the possibility of exaggerated ideas that might arise as to the powers of the Analytical Engine. ... The Analytical Engine has no pretensions whatever to originate anything. It can do whatever we know how to order it to perform. It can follow analysis; but it has no power of anticipating any analytical relations or truths. Its province is to assist us in making available what we are already acquainted with.[35]


One of the most influential contributions to this question was an essay written in 1950 by pioneering computer scientist Alan Turing, titled Computing Machinery and Intelligence. Turing disavowed any interest in terminology, saying that even "Can machines think?" is too loaded with spurious connotations to be meaningful; but he proposed to replace all such questions with a specific operational test, which has become known as the Turing test.[36] To pass the test a computer must be able to imitate a human well enough to fool interrogators. In his essay Turing discussed a variety of possible objections, and presented a counterargument to each of them. The Turing test is commonly cited in discussions of artificial intelligence as a proposed criterion for machine consciousness; it has provoked a great deal of philosophical debate. For example, Daniel Dennett and Douglas Hofstadter argue that anything capable of passing the Turing test is necessarily conscious,[37] while David Chalmers argues that a philosophical zombie could pass the test, yet fail to be conscious.[38]
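Because Turing's proposal is operational, its structure can be shown as a short procedural sketch. The following Python sketch is purely illustrative: the names (Respondent, run_trial, passes_test), the judging interface, and the 50% pass threshold are assumptions introduced here, not anything specified in Turing's paper.

    import random

    class Respondent:
        """A hidden conversational partner: either a human or a machine."""
        def __init__(self, is_machine, reply_fn):
            self.is_machine = is_machine
            self.reply = reply_fn  # maps a question string to an answer string

    def run_trial(judge, human, machine, questions):
        """One round of the imitation game. The judge sees only the two
        answer streams, presented in random order, and guesses which
        respondent is the machine. Returns True if the machine was
        misidentified (i.e., it fooled the judge on this trial)."""
        a, b = random.sample([human, machine], 2)  # hide which is which
        transcript = [(q, a.reply(q), b.reply(q)) for q in questions]
        judge_picks_a = judge(transcript)  # True if the judge accuses 'a'
        accused = a if judge_picks_a else b
        return not accused.is_machine

    def passes_test(judge, human, machine, questions, trials=100):
        """Illustrative criterion: the machine passes if judges identify
        it no better than chance. The threshold is an assumption."""
        fooled = sum(run_trial(judge, human, machine, questions)
                     for _ in range(trials))
        return fooled / trials >= 0.5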

Some philosophers have argued that it might be possible, at least in principle, for a machine to be conscious, but that this would not just be a matter of executing the right computer program. This viewpoint has most notably been advocated by John Searle, who defended it using a thought experiment that has become known as the Chinese room argument: Suppose the Turing test is conducted in Chinese rather than English, and suppose a computer program successfully passes it. Does the system that is executing the program understand Chinese? Searle's argument is that he could pass the Turing test in Chinese too, by executing exactly the same program as the computer, yet without understanding a word of Chinese. How does Searle know whether he understands Chinese? Understanding a language is not just something you do: it feels like something to understand Chinese (or any language). Understanding is conscious, and therefore there is more to it than merely executing a program, even a Turing-test-passing program.

Searle's essay has been second only to Turing's in the volume of debate it has generated,[39] but Searle himself was vague about what extra ingredients it would take to make a machine conscious: all he proposed was that machines would need "causal powers" of the sort that the brain has and that computers lack. Other thinkers sympathetic to his basic argument have suggested that the necessary (though perhaps still not sufficient) extra conditions may include the ability to pass not just the verbal version of the Turing test but the robotic version,[40] which requires grounding the robot's words in the robot's sensorimotor capacity to categorize and interact with the things in the world that its words are about, Turing-indistinguishably from a real person. Turing-scale robotics is an empirical branch of research on embodied cognition and situated cognition.[41]

Scientific approaches

For many decades, consciousness as a research topic was avoided by the majority of mainstream scientists, because of a general feeling that a phenomenon defined in subjective terms could not properly be studied using objective experimental methods.[42] Starting in the 1980s, though, an expanding community of neuroscientists and psychologists have associated themselves with a field called Consciousness Studies, giving rise to a stream of experimental work published in journals such as Consciousness and Cognition, and methodological work published in journals such as the Journal of Consciousness Studies, along with regular conferences organized by groups such as the Association for the Scientific Study of Consciousness.

Modern scientific investigations into consciousness are based on psychological experiments (including, for example, the investigation of priming effects using subliminal stimuli), and on case studies of alterations in consciousness produced by trauma, illness, or drugs. Broadly viewed, scientific approaches are based on two core concepts. The first identifies the content of consciousness with the experiences that are reported by human subjects; the second makes use of the concept of consciousness that has been developed by neurologists and other medical professionals who deal with patients whose behavior is impaired. In either case, the ultimate goals are to develop techniques for assessing consciousness objectively in humans as well as other animals, and to understand the neural and psychological mechanisms that underlie it.

Measurement

The Necker Cube, an ambiguous image

Experimental research on consciousness presents special difficulties, due to the lack of a universally accepted operational definition. In the majority of experiments that are specifically about consciousness, the subjects are human, and the criterion that is used is verbal report: in other words, subjects are asked to describe their experiences, and their descriptions are treated as observations of the contents of consciousness. For example, subjects who stare continuously at a Necker Cube usually report that they experience it "flipping" between two 3D configurations, even though the stimulus itself remains the same. The objective is to understand the relationship between the conscious awareness of stimuli (as indicated by verbal report) and the effects the stimuli have on brain activity and behavior. In several paradigms, such as the technique of response priming, the behavior of subjects is clearly influenced by stimuli for which they report no awareness.[43]

Verbal report is widely considered to be the most reliable indicator of consciousness, but it raises a number of issues. For one thing, if verbal reports are treated as observations, akin to observations in other branches of science, then the possibility arises that they may contain errors — but is it possible for subjects to be wrong about their own experiences? If so, how could anybody tell? Daniel Dennett has argued for an approach he calls heterophenomenology, which means treating verbal reports as stories that may or may not be true, but his ideas about how to do this have not been widely adopted.[44] Another issue with verbal report as a criterion is that it restricts the field of study to humans who have language: this approach cannot be used to study consciousness in other species, pre-linguistic children, or people with types of brain damage that impair language. As a third issue, philosophers who dispute the validity of the Turing test may feel that it is possible, at least in principle, for verbal report to be dissociated from consciousness entirely: a philosophical zombie may give detailed verbal reports of awareness in the absence of any genuine awareness.[45]

Although verbal report is in practice the "gold standard" for ascribing consciousness, it is not the only possible criterion. In medicine, consciousness is assessed as a combination of verbal behavior, arousal, brain activity and purposeful movement. The last three of these can be used as indicators of consciousness when verbal behavior is absent. The scientific literature regarding the neural bases of arousal and purposeful movement is very extensive. Their reliability as indicators of consciousness is disputed, however, due to numerous studies showing that alert human subjects can be induced to behave purposefully in a variety of ways in spite of reporting a complete lack of awareness. Studies of the neuroscience of free will have also shown that the experiences that people report when they behave purposefully sometimes do not correspond to their actual behaviors or to the patterns of electrical activity recorded from their brains.

Another approach applies specifically to the study of self-consciousness, that is, the ability to distinguish oneself from others. In the 1970s Gordon Gallup developed an operational test for self-awareness, known as the mirror test. The test examines whether animals are able to differentiate between seeing themselves in a mirror versus seeing other animals. The classic example involves placing a spot of coloring on the skin or fur near the individual's forehead and seeing if they attempt to remove it or at least touch the spot, thus indicating that they recognize that the individual they are seeing in the mirror is themselves. Humans (older than 18 months) and other great apes, bottlenose dolphins, pigeons, and elephants[46] have all been observed to pass this test. As a control, an identical spot is usually placed elsewhere on the head using a non-visible material, to ensure that the subject is not merely responding to the tactile stimulus of the spot's presence.
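The logic of the control condition can be summarized in a minimal sketch (Python; the function and argument names are invented here for illustration and do not reproduce Gallup's published protocol, which varies by species and laboratory):

    def passes_mirror_test(mark_touches_with_mirror,
                           mark_touches_without_mirror,
                           control_touches_with_mirror):
        """Counts of touches directed at the visible mark (with and without
        the mirror present) and at the non-visible control mark (with the
        mirror present). Self-recognition is inferred only if touching is
        guided by what the subject sees in the mirror, not by feeling the
        mark itself."""
        visually_guided = mark_touches_with_mirror > mark_touches_without_mirror
        not_tactile = control_touches_with_mirror <= mark_touches_without_mirror
        return visually_guided and not_tactile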

Neural correlates

Schema of the neural processes underlying consciousness, from Christof Koch

A major part of the scientific literature on consciousness consists of studies that examine the relationship between the experiences reported by subjects and the activity that simultaneously takes place in their brains — that is, studies of the neural correlates of consciousness. The hope is to find that activity in a particular part of the brain, or a particular pattern of global brain activity, will be strongly predictive of conscious awareness. Several brain imaging techniques, such as EEG and fMRI, have been used for physical measures of brain activity in these studies.

One idea that has drawn attention for several decades is that consciousness is associated with high-frequency (gamma band) EEG oscillations. This idea arose from proposals in the 1980s, by Christoph von der Malsburg and Wolf Singer, that gamma oscillations could solve the so-called binding problem, by linking information represented in different parts of the brain into a unified experience.[47] Rodolfo Llinás, for example, proposed that consciousness results from recurrent thalamo-cortical resonance, in which the specific thalamocortical systems (carrying content) and the non-specific (centromedial thalamus) thalamocortical systems (providing context) interact at gamma-band frequency via temporal coincidence.[48]

A number of studies have shown that activity in primary sensory areas of the brain is not sufficient to produce consciousness: it is possible for subjects to report a lack of awareness even when areas such as the primary visual cortex show clear electrical responses to a stimulus. Higher brain areas are seen as more promising, especially the prefrontal cortex, which is involved in a range of higher cognitive functions collectively known as executive functions.[49] There is substantial evidence that a "top-down" flow of neural activity (i.e., activity propagating from the frontal cortex to sensory areas) is more predictive of conscious awareness than a "bottom-up" flow of activity.[50] The prefrontal cortex is not the only candidate area, however: studies by Nikos Logothetis and his colleagues have shown, for example, that visually responsive neurons in parts of the temporal lobe reflect what is consciously perceived when conflicting visual images are presented to the two eyes (i.e., bistable percepts during binocular rivalry).

The studies of blindsight — vision without awareness after lesions to parts of the visual system such as the primary visual cortex — performed by Lawrence Weiskrantz and David P. Carey provided important insights into how conscious perception arises in the brain,[51] but they again raise questions about the possibility or impossibility of philosophical zombies.

Biological function and evolution

Regarding the primary function of conscious processing, a recurring idea in recent theories is that phenomenal states somehow integrate neural activities and information-processing that would otherwise be independent (see review in Baars, 2002). This has been called the integration consensus. However, it remains unspecified which kinds of information are integrated in a conscious manner and which kinds can be integrated without consciousness. Nor is it explained what specific causal role conscious integration plays, nor why the same functionality cannot be achieved without consciousness. Obviously not all kinds of information are capable of being disseminated consciously (e.g., neural activity related to vegetative functions, reflexes, unconscious motor programs, low-level perceptual analyses, etc.), and many kinds of information can be disseminated and combined with other kinds without consciousness, as in intersensory interactions such as the ventriloquism effect.[52] Hence it remains unclear why any of it is conscious.

As noted earlier, even among writers who consider consciousness to be a "definite thing," there is widespread dispute about which animals other than humans can be said to possess it.[53] Thus, any examination of the evolution of consciousness is faced with great difficulties. Nevertheless, some writers have argued that consciousness can be viewed from the standpoint of evolutionary biology as an adaptation in the sense of a trait that increases fitness.[54] In his paper "Evolution of consciousness," John Eccles argued that special anatomical and physical properties of the mammalian cerebral cortex gave rise to consciousness.[55] Bernard Baars proposed that once in place, this "recursive" circuitry may have provided a basis for the subsequent development of many of the functions that consciousness facilitates in higher organisms.[56]

Other philosophers, however, have suggested that consciousness would not be necessary for any functional advantage in evolutionary processes.[57] No one has given a causal explanation, they argue, of why it would not be possible for a functionally equivalent non-conscious organism (i.e., a philosophical zombie) to achieve the very same survival advantages as a conscious organism [58]. If evolutionary processes are blind to the difference between function F being performed by conscious organism O and non-conscious organism O*, it is unclear what adaptive advantage consciousness could provide.

States of consciousness

A Buddhist monk meditating

There are some states in which consciousness seems to be abolished, including sleep, coma, and death. There are also a variety of circumstances that can change the relationship between the mind and the world in less drastic ways, producing what are known as altered states of consciousness. Some altered states occur naturally; others can be produced by drugs or brain damage.

The two most widely accepted altered states are sleep and dreaming. Although dream sleep and non-dream sleep appear very similar to an outside observer, each is associated with a distinct pattern of brain activity, metabolic activity, and eye movement;[59] each is also associated with a distinct pattern of experience and cognition. During ordinary non-dream sleep, people who are awakened report only vague and sketchy thoughts, and their experiences do not cohere into a continuous narrative. During dream sleep, in contrast, people who are awakened report rich and detailed experiences in which events form a continuous progression, which may, however, be interrupted by bizarre or fantastic intrusions. Thought processes during the dream state also frequently show a high level of irrationality. Both dream and non-dream states are associated with severe disruption of memory: memory of mental activity usually disappears within seconds in the non-dream state, and within minutes after awakening from a dream unless actively refreshed.

A variety of psychoactive drugs have notable effects on consciousness. These range from a simple dulling of awareness produced by sedatives, to increases in the intensity of sensory qualities produced by stimulants, cannabis, or most notably by the class of drugs known as psychedelics. LSD, mescaline, psilocybin, and others in this group can produce major distortions of perception, including hallucinations; some users even describe their drug-induced experiences as mystical or spiritual in quality. The brain mechanisms underlying these effects are not well understood, but there is substantial evidence that alterations in the brain system that uses the chemical neurotransmitter serotonin play an essential role.

There has been some research into physiological changes in yogis and people who practice various techniques of meditation. Recent research on brain waves during meditation has shown a distinct difference between those corresponding to ordinary relaxation and those corresponding to meditation.[60] It is disputed, however, whether there is enough evidence to count these as physiologically distinct states of consciousness.[61]

The most extensive study of the characteristics of altered states of consciousness was made by psychologist Charles Tart in the 1960s and 1970s. Tart analyzed a state of consciousness as made up of a number of component processes, including exteroception (sensing the external world); interoception (sensing the body); input-processing (seeing meaning); emotions; memory; time sense; sense of identity; evaluation and cognitive processing; motor output; and interaction with the environment.[62] Each of these, in his view, could be altered in multiple ways by drugs or other manipulations. The components that Tart identified have not, however, been validated by empirical studies. Research in this area has not yet reached firm conclusions, but a recent questionnaire-based study identified eleven significant factors contributing to drug-induced states of consciousness: experience of unity; spiritual experience; blissful state; insightfulness; disembodiment; impaired control and cognition; anxiety; complex imagery; elementary imagery; audio-visual synesthesia; and changed meaning of percepts.[63]

Medical aspects

In medicine, consciousness is measured by neuropsychological assessment.[64] It is of concern to patients, physicians, and ethicists as well as biological scientists and biomedical engineers. Patients may suffer from disorders of consciousness and seek medical treatment. Physicians may perform interventions that alter consciousness, such as instructing the patient to sleep, administering general anesthesia, or inducing medical coma. Bioethicists and neuroethicists may be concerned with the ethical implications of consciousness in medical cases of patients such as Karen Ann Quinlan[65] and Terri Schiavo.[66] Furthermore, biological scientists study patients with these disorders, while biomedical engineers develop neuroprosthetics for them.

Assessment

There are two commonly used methods for assessing the level of consciousness of a patient, a simple procedure that requires minimal training, and a more complex procedure that requires substantial expertise. The simple procedure begins by asking whether the patient is able to move and react to physical stimuli. If so, the next question is whether the patient can respond in a meaningful way to questions and commands. If so, the patient is asked for name, current location, and current day and time. A patient who can answer all of these questions correctly is said to be "oriented times three" (sometimes denoted "Ox3" on a medical chart), and is usually considered fully conscious. A patient who can additionally describe the current situation may be referred to as "oriented times four".
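The stepwise character of the simple procedure can be sketched as follows (Python; the function name, argument names, and return labels are illustrative assumptions, not a standard clinical instrument):

    def orientation_level(reacts_to_stimuli, responds_meaningfully,
                          knows_name, knows_place, knows_time,
                          knows_situation=False):
        """Walks the bedside assessment described above, from arousal
        through orientation to person, place, time, and (optionally)
        the current situation. Each argument is the boolean outcome of
        one step of the assessment."""
        if not reacts_to_stimuli:
            return "unresponsive"
        if not responds_meaningfully:
            return "responsive but not meaningfully communicative"
        oriented = sum([knows_name, knows_place, knows_time])
        if oriented < 3:
            return "oriented times %d" % oriented
        return ("oriented times four (Ox4)" if knows_situation
                else "oriented times three (Ox3)")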

Glasgow Coma Scale for rating level of consciousness

Rating | Eyes                                      | Verbal                       | Motor
1      | Does not open eyes                        | Makes no sounds              | Makes no movements
2      | Opens eyes in response to painful stimuli | Incomprehensible sounds      | Extension to painful stimuli
3      | Opens eyes in response to voice           | Utters inappropriate words   | Abnormal flexion to painful stimuli
4      | Opens eyes spontaneously                  | Confused, disoriented        | Flexion / withdrawal to painful stimuli
5      | N/A                                       | Oriented, converses normally | Localizes painful stimuli
6      | N/A                                       | N/A                          | Obeys commands

The more complex procedure is known as a neurological examination, and is usually carried out by a neurologist in a hospital setting. A formal neurological examination runs through a precisely defined series of tests, beginning with tests for basic sensorimotor reflexes and culminating with tests for sophisticated use of language. The outcome may be summarized using the Glasgow Coma Scale, which yields a number in the range 3 to 15, with a score of 3 indicating the lowest level of consciousness and 15 indicating full consciousness. The Glasgow Coma Scale has three subscales, measuring the best motor response (ranging from "no motor response" to "obeys commands"), the best eye response (ranging from "no eye opening" to "eyes opening spontaneously") and the best verbal response (ranging from "no verbal response" to "fully oriented"). There is also a simpler pediatric version of the scale, for children too young to be able to use language.
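Since the total is simply the sum of the three subscale ratings in the table above, the computation can be sketched directly (Python; the range checks follow the table, while the function name and example values are illustrative):

    def glasgow_coma_score(eye, verbal, motor):
        """eye: 1-4, verbal: 1-5, motor: 1-6, per the table above.
        Returns the total, from 3 (lowest) to 15 (full consciousness)."""
        if not (1 <= eye <= 4 and 1 <= verbal <= 5 and 1 <= motor <= 6):
            raise ValueError("subscale rating out of range")
        return eye + verbal + motor

    # Example: opens eyes spontaneously (4), confused speech (4),
    # localizes painful stimuli (5) gives a total score of 13.
    assert glasgow_coma_score(4, 4, 5) == 13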

Disorders of consciousness

Medical conditions that inhibit consciousness are considered disorders of consciousness.[67] This category generally includes minimally conscious state and persistent vegetative state, but sometimes also includes the less severe locked-in syndrome and more severe chronic coma.[67][68] Differential diagnosis of these disorders is an active area of biomedical research.[69][70][71] Finally, brain death results in an irreversible disruption of consciousness.[67] While other conditions may cause a moderate deterioration (e.g., dementia and delirium) or transient interruption (e.g., grand mal and petit mal seizures) of consciousness, they are not included in this category.

Disorder                    | Description
Locked-in syndrome          | The patient has awareness, sleep-wake cycles, and meaningful behavior (viz., eye movement), but is isolated due to quadriplegia and pseudobulbar palsy.
Minimally conscious state   | The patient has intermittent periods of awareness and wakefulness and displays some meaningful behavior.
Persistent vegetative state | The patient has sleep-wake cycles, but lacks awareness and only displays reflexive and non-purposeful behavior.
Chronic coma                | The patient lacks awareness and sleep-wake cycles and only displays reflexive behavior.
Brain death                 | The patient lacks awareness, sleep-wake cycles, and behavior.
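Read as data, the table distinguishes these disorders by which capacities are preserved. A sketch of that feature structure (Python; the field names and value labels paraphrase the table above and are not clinical diagnostic criteria):

    DISORDERS = {
        "locked-in syndrome":          {"awareness": True,  "sleep_wake": True,  "behavior": "meaningful (eye movement)"},
        "minimally conscious state":   {"awareness": "intermittent", "sleep_wake": True, "behavior": "some meaningful"},
        "persistent vegetative state": {"awareness": False, "sleep_wake": True,  "behavior": "reflexive only"},
        "chronic coma":                {"awareness": False, "sleep_wake": False, "behavior": "reflexive only"},
        "brain death":                 {"awareness": False, "sleep_wake": False, "behavior": "none"},
    }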

References

  1. ^ Robert van Gulick (2004). "Consciousness". Stanford Encyclopedia of Philosophy.
  2. ^ Farthing G (1992). The Psychology of Consciousness. Prentice Hall. ISBN 9780137286683.
  3. ^ John Searle (2005). "Consciousness". In Honderich T (ed.). The Oxford companion to philosophy. Oxford University Press. ISBN 9780199264797.
  4. ^ "Introduction". The Blackwell Companion to Consciousness. Malden MA: Wiley. 2008. ISBN 9780470751459. {{cite book}}: Cite uses deprecated parameter |authors= (help); Unknown parameter |editors= ignored (|editor= suggested) (help)
  5. ^ Güzeldere G (1997). The Nature of Consciousness: Philosophical debates. Cambridge, MA: MIT Press. pp. 1–67.
  6. ^ "Late recovery from the minimally conscious state: ethical and policy implications". Neurology. 68: 304–7. 2007. PMID 17242341.
  7. ^ Locke, John. "An Essay Concerning Human Understanding (Chapter XXVII)". Australia: University of Adelaide. Retrieved August 20, 2010.
  8. ^ "Science & Technology: consciousness". Encyclopedia Britannica. Retrieved August 20, 2010.
  9. ^ C. S. Lewis (1990). "Ch. 8: Conscience and conscious". Studies in words. Cambridge University Press. ISBN 9780521398312.
  10. ^ James Ussher, Charles Richard Elrington (1613). The whole works, Volume 2. Hodges and Smith. p. 417.
  11. ^ Hastings, James; Selbie, John A. (2003). Encyclopedia of Religion and Ethics Part 7. Kessinger Publishing. p. 41. ISBN 0766136779. note: "In the sense of 'consciousness,' conscientia is rare, but it is exceedingly common in most writers after Cicero with the meaning 'conscience'."
  12. ^ Melenaar, G. Mnemosyne, Fourth Series. Vol. 22. Brill. pp. 170–180. note: reference only that Cicero had been using the word.
  13. ^ Hennig, Boris (2007). "Cartesian Conscientia". British Journal for the History of Philosophy. 15 (3): 455–484.
  14. ^ Heinämaa, Sara (2007). Consciousness: from perception to reflection in the history of philosophy. Springer. pp. 205–206. ISBN 978-1-4020-6081-6.
  15. ^ Stuart Sutherland (1989). "Consciousness". Macmillan Dictionary of Psychology. Macmillan. ISBN 9780333388297.
  16. ^ "Two conceptions of subjective experience". Philosophical Studies. 151: 299–327. 2010. doi:10.1007/s11098-009-9439-x. {{cite journal}}: Cite uses deprecated parameter |authors= (help)
  17. ^ Gilbert Ryle (1949). The Concept of Mind. University of Chicago Press. pp. 156–163. ISBN 9780226732961.
  18. ^ a b Antonio Damasio (1999). The Feeling of What Happens: Body and Emotion in the Making of Consciousness. New York: Harcourt Press. ISBN 9780156010757.
  19. ^ Ned Block (1998). "On a confusion about a function of consciousness". The Nature of Consciousness: Philosophical Debates. MIT Press. pp. 375–415. ISBN 9780262522106.
  20. ^ Daniel Dennett (2004). Consciousness Explained. Penguin. p. 375. ISBN 09713990376.
  21. ^ David Chalmers (1995). "Facing up to the problem of consciousness". Journal of Consciousness Studies. 2: 200–219.
  22. ^ William Lycan (1996). Consciousness and Experience. MIT Press. pp. 1–4. ISBN 9780262121972.
  23. ^ Dy, Jr., Manuel B. (2001). Philosophy of Man: selected readings. Goodwill Trading Co. p. 97. ISBN 971-12-0245-X.
  24. ^ "Descartes and the Pineal Gland". Stanford University. November 5, 2008. Retrieved Aug.22, 2010. {{cite web}}: Check date values in: |accessdate= (help)
  25. ^ a b Julien Offray de La Mettrie (1996). Ann Thomson (ed.). Machine man and other writings. Cambridge University Press. ISBN 9780521478496.
  26. ^ Gerald Edelman (1993). Bright Air, Brilliant Fire: On the Matter of the Mind. Basic Books. ISBN 9780465007646.
  27. ^ Dennett D (1991). Consciousness Explained. Boston: Little & Company. ISBN 9780316180665.
  28. ^ a b Christof Koch (2004). The Quest for Consciousness. Englewood CO: Roberts & Company. ISBN 9780974707709.
  29. ^ Searle, J. (1997) The Mystery of Consciousness, London, Granta Books, p84
  30. ^ Merlin Donald (2002). A mind so rare: the evolution of human consciousness. W. W. Norton & Company. p. 202. ISBN 9780393323191.
  31. ^ Knobe J (2008). "Can a Robot, an Insect or God Be Aware?". Scientific American: Mind.
  32. ^ Thomas Nagel (1991). "Ch. 12 What is it like to be a bat?". Mortal Questions. Cambridge University Press. ISBN 9780521406765.
  33. ^ Douglas Hofstadter (1981). "Reflections on What Is It Like to Be a Bat?". The Mind's I. Basic Books. pp. 403–414. ISBN 046504624.
  34. ^ Donald Griffin (2001). Animal Minds: Beyond Cognition to Consciousness. University of Chicago Press. ISBN 9780226308654.
  35. ^ Ada Lovelace. "Sketch of The Analytical Engine, Note G".
  36. ^ Stuart Shieber, ed. (2004). The Turing test : verbal behavior as the hallmark of intelligence. MIT Press. ISBN 9780262692939.
  37. ^ Douglas R. Hofstadter, Daniel C. Dennett, eds. (1985). The Mind's I. Basic Books. ISBN 9780553345841.
  38. ^ David Chalmers (1997). The Conscious Mind: In Search of a Fundamental Theory. Oxford University Press. ISBN 019511789.
  39. ^ John Searle et al. (1980). "Minds, brains, and programs". Behavioral and Brain Sciences. 3: 417–457. doi:10.1017/S0140525X00005756.
  40. ^ Oppy, Graham & Dowe, David (2011) The Turing Test. Stanford Encyclopedia of Philosophy.
  41. ^ Margaret Wilson (2002). "Six views of embodied cognition". Psychonomic Bulletin & Review. 9: 625–636.
  42. ^ Hendriks-Jansen, Horst (1996). Catching ourselves in the act: situated activity, interactive emergence, evolution, and human thought. Massachusetts Institute of Technology. p. 114. ISBN 0-262-08246-2.
  43. ^ "Criteria for unconscious cognition: Three types of dissociation". Perception and Psychophysics. 68: 489–504. 2006. {{cite journal}}: Cite uses deprecated parameter |authors= (help)
  44. ^ Daniel Dennett (2003). "Who's on first? Heterophenomenology explained". Journal of Consciousness Studies. 10: 19–30.
  45. ^ David J. Chalmers (1996). "Ch 3: Can consciousness be reductively explained?". The Conscious Mind. Oxford University Press. ISBN 9780195117899.
  46. ^ "Elephants see themselves in the mirror". New Scientist. 30 October 2006.
  47. ^ Wolf Singer. "Binding by synchrony". Scholarpedia.
  48. ^ Rodolfo Llinás (2002). I of the Vortex. From Neurons to Self. MIT Press. ISBN 9780262621632.
  49. ^ FIXME Christof Koch and Francis Crick
  50. ^ FIXME Steven Wise, Mikhail Lebedev, Nikos Logothetis, etc
  51. ^ David P. Carey, Arash Sahraie, Ceri T. Trevethan & Larry Weiskrantz (2008). "Does localisation blindsight extend to two-dimensional targets?". Neuropsychologia. 46: 3053–3060.
  52. ^ Ezequiel Morsella & John A. Bargh (2007) Supracortical consciousness: Insights from temporal dynamics, processing-content, and olfaction Behavioral and Brain Sciences 30(1): 100
  53. ^ Budiansky S (1998). If a Lion Could Talk: Animal Intelligence and the Evolution of Consciousness. New York: The Free Press. ISBN 9780684837109.
  54. ^ "Adaptive Complexity and Phenomenal Consciousness". Philosophy of Science. 67: 648–670. 2000. {{cite journal}}: Cite uses deprecated parameter |authors= (help)
  55. ^ Eccles JC (1992). "Evolution of consciousness". Proc. Natl. Acad. Sci. U.S.A. 89 (16): 7320–4. doi:10.1073/pnas.89.16.7320. PMC 49701. PMID 1502142. {{cite journal}}: Unknown parameter |month= ignored (help)
  56. ^ Baars BJ (1993). A Cognitive Theory of Consciousness. Cambridge University Press.
  57. ^ "Zombies and the function of consciousness". Journal of Consciousness Studies. 2: 313–21. 1995. {{cite journal}}: Cite uses deprecated parameter |authors= (help)
  58. ^ Harnad S (2002). "Turing Indistinguishability and the Blind Watchmaker". In Fetzer JH (ed.). Consciousness Evolving. John Benjamins. {{cite book}}: External link in |chapter= (help)
  59. ^ W. H. Moorcraft (2005). Understanding Sleep and Dreaming. Springer.
  60. ^ "Brain waves and meditation". ScienceDaily. http://www.sciencedaily.com/releases/2010/03/100319210631.htm
  61. ^ Murphy M., Donovan S., Taylor E. (1997). The Physical and Psychological Effects of Meditation: A Review of Contemporary Research With a Comprehensive Bibliography, 1931-1996. Institute of Noetic Sciences.
  62. ^ Charles Tart (2001). "Chapter 2. The components of consciousness". States of Consciousness. IUniverse.com. ISBN 9780595151967.
  63. ^ "Psychometric evaluation of the altered states of consciousness rating scale (OAV)". PLoS ONE. 8. 2010. PMC 2930851. PMID 20824211. {{cite journal}}: Cite uses deprecated parameter |authors= (help)
  64. ^ Giacino JT, Smart CM (December 2007). "Recent advances in behavioral assessment of individuals with disorders of consciousness". Curr Opin Neurol. 20 (6): 614–619. doi:10.1097/WCO.0b013e3282f189ef. PMID 17992078.
  65. ^ Kinney HC, Korein J, Panigrahy A, Dikkes P, Goode R (26 May 1994). "Neuropathological findings in the brain of Karen Ann Quinlan: the role of the thalamus in the persistent vegetative state". N Engl J Med. 330 (21): 1469–1475. PMID 8164698.
  66. ^ Goodnough A (16 Jun 2005). "Schiavo autopsy says brain, withered, was untreatable". The New York Times. p. A24.
  67. ^ a b c Bernat JL (8 Apr 2006). "Chronic disorders of consciousness". Lancet. 367 (9517): 1181–1192. doi:10.1016/S0140-6736(06)68508-5. PMID 16616561.
  68. ^ Bernat JL (20 Jul 2010). "The natural history of chronic disorders of consciousness". Neurol. 75 (3): 206–207. doi:10.1212/WNL.0b013e3181e8e960. PMID 20554939.
  69. ^ Coleman MR, Davis MH, Rodd JM, Robson T, Ali A, Owen AM, Pickard JD (September 2009). "Towards the routine use of brain imaging to aid the clinical diagnosis of disorders of consciousness". Brain. 132 (9): 2541–2552. doi:10.1093/brain/awp183. PMID 19710182.
  70. ^ Monti MM, Vanhaudenhuyse A, Coleman MR, Boly M, Pickard JD, Tshibanda L, Owen AM, Laureys S (18 Feb 2010). "Willful modulation of brain activity in disorders of consciousness". N Engl J Med. 362 (7): 579–589. doi:10.1056/NEJMoa0905370. PMID 20130250.
  71. ^ Seel RT, Sherer M, Whyte J, Katz DI, Giacino JT, Rosenbaum AM, Hammond FM, Kalmar K, Pape TL, et al. (December 2010). "Assessment scales for disorders of consciousness: evidence-based recommendations for clinical practice and research". Arch Phys Med Rehabil. 91 (12): 1795–1813. doi:10.1016/j.apmr.2011.01.002. PMID 21112421.
