Talk:Sentience

From Wikipedia, the free encyclopedia
This article is of interest to the following WikiProjects:
WikiProject Philosophy (rated C-class, High-importance)
WikiProject Animal rights (rated C-class, High-importance)
WikiProject Cognitive science (rated Start-class, Low-importance)

Media Ignorance

It should be noted that science fiction authors probably use sentience and sapience synonymously because they didn't do the research. 142.35.4.130 (talk) 17:25, 23 November 2010 (UTC)

Animal Rights Edits

"but allows non-human suffering. However, some animal rights activists hold that many of the distinguishing features of humanity - intelligence; language; self-awareness etc. - are not present in marginal cases such as young children or mentally disabled people, and conclude only what they call irrational speciesism can save this distinction."

I have edited this statement twice, removing

'However, some animal rights activists hold that many of the distinguishing features of humanity - intelligence; language; self-awareness etc. - are not present in marginal cases such as young children or mentally disabled people',

and replacing with simply

'However, many supposed distinguishing features of humanity - intelligence; language; self-awareness etc. - are not present in marginal cases such as young children or mentally disabled people'.

I feel this is a fair edit, as it is not disputed that human cases do exist wherein these features are lacking (see 'Animal Liberation' - Peter Singer). If this were not the case, then Singer's argument for animal rights would be flawed. As it is standard scientific belief that intelligence differs from human to human, and that it is possible for humans to lack language ('A Man Without Words' - Susan Schaller), I fail to see the need to add 'some animal rights activists hold...'. This part of the argument is not a claim from animal rights activists, but rather a scientifically accepted fact (that is, that these distinctions vary within humanity) - the animal rights argument comes from the application of this fact.

Secondly, I have edited

'and conclude only what they call irrational speciesism can save this distinction.'

to

'and conclude only that speciesism (prejudice on the basis of species) is the mistaken justification.'

Although I grant that this is not the best of wording (and needs to be altered), I stand by the spirit of the edit. Again, the point of the term 'speciesism' is to sum up prejudice on the basis of species. By definition, prejudice is not rational, and so again I fail to see the need to include 'only what they (animal rights supporters) call...'. If you wish to offer disputes (though I don't know of any specifically against this argument) to the conclusion that speciesism is the only justification left available, then by all means do so - but it is not correct to imply that only animal rights activists consider prejudice on the basis of species irrational. This is a given due to the implication of the words, and wouldn't be disputed by either side. Though some may attempt to point out alternative justifications that aren't matched by marginal cases (this is what some philosophers have attempted to do), none would question either that humans vary in the aforementioned attributes, or that prejudice is irrational.

On this basis, it seems to me that, after agreeing on better wording, the statements should be reverted to more or less what I had written before the edit. otashiro 01:17, 3 July 2006 (UTC)

"By definition prejudice is not rational" is nonsense. Prejudice can be rational or irrational. I see a nice blue stone on the ground and I am hungry. However, I pre-judge that this stone will not be edible. This is completely rational. In recent centuries we have begun to overcome some of the very irrational prejudices that human beings hold involving one another. Thinking that all prejudice is irrational is the result of this. The very term "speciesism" is a false analogy. Will in New Haven 65.79.173.135 (talk) 20:41, 13 April 2009 (UTC)

These animal rights edits in the header have to stop. There are better places to try and communicate your ideals. What are you people trying to do, brainwash school children? Sentience has very little to do with animal rights. Whoever is doing this needs a big, fat block. --IronMaidenRocks (talk) 23:04, 30 December 2008 (UTC)
IronMaidenRocks: I agree with your point but not the way you make it. Arguments on whether appeals against Bentham's position on animal cruelty are fallacious or not are not relevant to this article. Whether they are made by Animal Rights campaigners or not is irrelevant, as are the appeals and their status. I do agree, though, that the way the article gets bogged down in irrelevant details here is probably due to questions on animal rights, which belong somewhere else. My only objection is that it makes for a poor article - a stub with a dull debate injected into it. Dilaudid (talk) 11:41, 21 June 2012 (UTC)
You seem not to understand the connection between sentience and AR. You should also try to communicate in a more civil manner on the issue. The connection between sentience and AR is that animals generally, not only humans, are sentient, and that sentience is an essential feature for a being to have moral status, that is, broadly speaking, rights. I don't think there is too much reference to AR in the article as it is, at least in absolute terms; rather, I think the connections should be expanded. There are many other things to say too about sentience, particularly on the issues of mind/body, the philosophy of sentience, etc., so if you want to contribute, you are welcome; perhaps then the relative weight of AR in the article will be lower. However, I certainly think the animal question is more important than, say, that of alien life and so on. David Olivier (talk) 01:40, 31 December 2008 (UTC)
Both animal rights and science fiction belong in this article. A quick study of What links here shows that the term is mostly used by science fiction and related fields, followed by eastern philosophy and some animal rights. There are also plenty of links from western philosophy, neurology and the study of consciousness, although these fields tend to use a different and more precise terminology (e.g. "qualia", "consciousness", "NCC", "perception" etc.) and the moral dimension is completely absent. ---- CharlesGillingham (talk) 18:03, 31 December 2008 (UTC)

Maybe, instead of 'However, many supposed distinguishing features of humanity - intelligence; language; self-awareness etc. - are not present in marginal cases such as young children or mentally disabled people', it could be said that 'However, many supposed distinguishing features of humanity - intelligence; language; self-awareness etc. - are thought by (primarily?) animal rights activists to be not actually present in marginal cases - such as young children or mentally disabled people'.

It hasn't really been proven that such faculties aren't present in young children or mentally disabled people. Also, you should still note who gave this opinion (animal rights activists). Wikipedia encourages citations as always. 142.35.4.130 (talk) 17:29, 23 November 2010 (UTC)

I think it is rather clear that a new-born baby doesn't possess language, in the same sense that a cat doesn't possess language. To hold that it hasn't been "proven" is to demand an absurdly high level of proof, and by that standard it hasn't been proven that a stone can't talk either. The same goes for self-awareness and intelligence. David Olivier (talk) 09:07, 24 November 2010 (UTC)

Principles of Sentient and Sapient Life

Is there a proposed set of principles that can help determine or define sentient and sapient life?

Principles that might apply are self-sacrifice, self-awareness, creativity, level of intelligence (and how it would be determined), the ability to hope, and even to regret.
These were ideas that I thought might apply.
I don't know of any agreed upon tests, although you seem to be talking about sapience instead of sentience. Humans are included, but some (e.g., Descartes) think only humans have it! (See the great ape project for a different opinion.) MShonle 01:46, 11 Jun 2005 (UTC)
Good point. I think I am looking for some combined definition sentient and sapient life. There may not be a nice compact definition. We will probably end up with a hybrid classification system.
Hmm... It's tricky to define. It seems the general logic is 'if it has emotions, it's sentient'. But then (too) many things can be described as sentient. Suppose you have a computer program that prompts repeatedly for 'Current Emotion Level?' and stores the result in a variable, but doesn't do anything with it. This computer experiences (manually input) emotions, but doesn't do anything with them. So let's add a new condition: 'it' has to act upon its emotions in order to be sentient. Still quite easy: if we go to the moon and visit dual-head universal Turing machines and say some arbitrary tape position represents emotion, you got your sentience. But this machine may actually be a 'smart machine'... in fact it is. But then we get all confused 'cuz we're standing on the wrong side of the road. Why *is* this sentient? Let's say it's not sentient, but we are. What can we do that this can't? Good luck if you find out. The Nobel Prize is waiting... --Ihope127 7 July 2005 22:15 (UTC)
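The "emotion variable" thought experiment above can be sketched in a few lines. This is purely a hypothetical illustration of the commenter's example (the class and method names are invented for the sketch), not a claim about any real system:

```python
# A minimal sketch of the thought experiment above: a program that
# records an "emotion" value but never acts on it. The commenter's
# point is that merely storing a value labelled "emotion" is clearly
# not sentience. All names here are hypothetical.

class EmotionLogger:
    """Stores manually input 'emotion' values without ever using them."""

    def __init__(self):
        self.history = []

    def record(self, level):
        # The value is stored in a variable and nothing more is done
        # with it - it never influences any behaviour of the program.
        self.history.append(level)

logger = EmotionLogger()
for level in [3, 7, 1]:
    logger.record(level)  # "emotions" accumulate, unused
print(len(logger.history))
```

The stricter condition proposed next ("it has to act upon its emotions") would require the stored values to feed back into the program's behaviour, which this sketch deliberately does not do.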
Perhaps a more important question than deciding if other objects are sentient is are we ourselves "sentient?" By our definition, yes, we are. But what if an arbitrary construct decides that sentience, for example, can only exist for creatures of some certain defining quality? Sentience, as it is to us, the ability to reason and ponder our own existence, may not hold true for a being with a completely different physical or mental state. All I'm trying to say is that our idea of sentience is based on what we perceive as being sentient and self-aware. Our "sentience" may not necessarily be another beings "sentience." Nick 01:48, 8 June 2006 (UTC)

I'm not too sure about what previous users have said in this section. I'm pretty sure the definition of sentience is consciousness, that is awareness - sense. What you are commenting on may be more related to sapience.

sen·tient (sĕn′shənt, -shē-ənt) adj.


1. Having sense perception; conscious: “The living knew themselves just sentient puppets on God's stage” (T.E. Lawrence).

2. Experiencing sensation or feeling. (Dictionary.com)

Sentience does not require emotions, or the ability to act on them, as someone said previously (and in your example it's very debatable whether or not your computers are experiencing emotion anyway - they'd have to be sentient in the first place in order to feel emotion. In fact this is Descartes' point: although animals react emotionally, and appear to be thinking and feeling, they are not sentient, that is aware (as he claims), and thus they are mechanical - they act exactly as if they were aware of feelings, thoughts and sense, but they are not, being simply robotic. It's also worth noting that Descartes' claims are strongly contested scientifically nowadays, biology extremely strongly implying he's wrong (putting aside philosophical debates about knowing other minds), and there are very few scientists today who don't believe animals are aware, or sentient). otashiro 23:23, 28 June 2006 (UTC)


Obviously the word sapient calls for a human-centric definition (homo sapiens - knowing _man_). So sapient means "possessing human intelligence", or more properly, possessing the level of intelligence of a human.

"Sapient" does not cover all of the bases here, however. No single word will. One needs words like sentience and phrases like self awareness, and so on, to further define AI.

I would advise people here to not get all twisted in tail-biting circles. Instead, K.I.S.S.: "Keep it simple, stupid." (affectionately said) 71.112.38.38 (talk) —Preceding undated comment added 17:58, 5 May 2010 (UTC).


This article makes sentience seem like a black-and-white thing: a creature is either sentient or "non-sentient". It does not mention the idea that perhaps sentience can occur in different degrees. Creatures could have varying amounts of sentience.


Quoted: You seem not to understand the connection between sentience and AR. You should also try to communicate in a more civil manner on the issue. The connection between sentience and AR is that animals generally, not only humans, are sentient, and that sentience is an essential feature for a being to have moral status, that is, broadly speaking, rights.

Just saying, I think a creature needs sapience (the ability to think in the manner of a human being, such as communicating complex ideas through language) in order to have morals, let alone grasp them. Otherwise, merely sentient creatures only display morals when protecting their young or performing other functions related to instinct. If you consider morals to be the state of being 'good' or 'evil', note that neither is really well defined. On the other hand, perception of 'good' and 'evil' is largely confined to the domain of humans - which regards morality and perhaps the ability to act in a moral manner.

Morals were originally rooted in philosophy. Even in the writing of old religious texts, theology played a part in determining what was and what wasn't a sin. This required writers and thinkers. 142.35.4.130 (talk) 17:35, 23 November 2010 (UTC)

For clarification: having moral status doesn't mean having morals. Having moral status is being a moral patient, while having morals has to do with being a moral agent. A new-born baby has moral status, but does not have morals. David Olivier (talk) 09:35, 24 November 2010 (UTC)

"Woot!" is all I can say

So... I discovered as a direct result of my being human (yes. I *am* human) that the human mind is an odd and fascinating thing. What better thing for an entity to study than others of its class? The mind can have a personality and switch from one to another at will; multiple can even coexist quite peacefully and converse with each other before merging back into a whole (nonphysically of course). My thinking: why should this be limited to humans? We did intelligence tests on lots of animals, but how do we know this is an accurate measure? These animals often have no experience with these tests, however humans do such things all the time. It is tricky to teach these things to an animal, though... but is it really their mind that's at fault? Humans can articulate, and dolphins can click as such (I like that sound :-), but cats, dogs, rats, etc. converse entirely in speech disfluencies. Furthermore their "communication" with the physical world itself is limited: humans and other primates have hands, while cats only have paws. I feel that if animals could make better use of their brains, they would. I could be completely wrong, however--just how did we find that dolphins were as smart as they are? But if we gave them the right toys (Cetacean Linux? Could happen), we might be able to find out a lot. --Ihope127 03:34, 22 August 2005 (UTC)

Cats communicate using a repertoire of 30-36 signs; dolphins alone are able to solve "complex" "natural" problems that were not previously taught. Not having hands is a serious hindering factor in the evolution of their brains. The neocortex is almost non-existent in non-human animals.--Procrastinating@talk2me 11:52, 8 June 2006 (UTC)
Actually, that's not true - it is non-existent in fish, but in the other species groups, moving up the evolutionary scale, it becomes increasingly larger, and it is certainly not almost non-existent in mammals at least. It's also worth noting, just as a side point, that the neocortex's size or absence has no bearing on sentience, emotion or the ability to suffer (see 'Animals in Translation', Temple Grandin). otashiro 23:07, 28 June 2006 (UTC)

I think people discovered dolphin intelligence simply by having them available to study in aquariums. Dolphins came to aquariums because people thought 'oh hey, those things are cute'. Economically, there's a demand to see dolphins in aquariums and so somebody made it so. Hence, dolphins were available for studying. Evolution is a complex thing and dolphin intelligence might have also largely developed just by interacting with humans. Ie. a trainer can teach a dolphin a new trick. The dolphin might not want to learn the new trick until the human offers him incentive in the form of mackerel. The dolphin learns the trick and gets the just reward - y'know how they love mackerel. 142.35.4.130 (talk) 17:42, 23 November 2010 (UTC)

I think the above paragraph is a sad and narrow perspective on the interaction of dolphins with humans. What about the millennia of observations of their behaviour by Polynesian sailors and those of other nationalities? --Graham Proud (talk) 13:15, 27 October 2013 (UTC)

Artificial Intelligence

"Some science fiction uses the term sentience to describe a species with human-like intelligence, but a more appropriate term is sapience."

I disagree, the only shows I've seen that consider a robotic "species" to be sentient are those in which they have more than just intelligence. Any programmable machine with enough memory storage can be taught to have "human-like intelligence", however they can't make choices for themselves, nor are they capable of any sort of feeling (as many in Sci-Fi are). --Anon.

I think the above misunderstands the term 'human like intelligence'. Any robot can be taught? As in, dynamic response to stimuli + emotions + ability to develop theories? We aren't discussing animated robots that merely look human, like those ones recently made in Japan. We are talking about honest to god human level intelligence (a creature that smiles isn't displaying humanity - showcasing language and theories, such as contributing to a business project, is showing humanity).

Obviously you've never taken software design. Please don't make ignorant statements like that.

That said - the discovery of other advanced life forms might lead to new definitions of intelligence. Sapience, in other words, may no longer just mean 'human' but have a more specific definition that relates to creatures of which humans are just one species among a possible multitude. 142.35.4.130 (talk) 17:48, 23 November 2010 (UTC)

In shows that's probably likely, because there are actors or voice actors playing the robots. But I think the sentence is primarily about sci-fi novels. MShonle 14:01, 8 September 2005 (UTC)
Then shouldn't that be stated? The live page simply says Sci-Fi, it doesn't say which medium. --Anon.
Anon, you aren't going to refute Strong AI by simply baldly stating that it is impossible and that things with "human-like intelligence" are nevertheless not conscious. --maru (talk) contribs 20:58, 18 July 2006 (UTC)
Anon, I'm confused. How are you disagreeing? You and the quoted sentence say the very same thing: Sci-Fi works often use "sentient" to mean "able to make judgments or choices." Do you mean that they are using the word correctly? Then you're wrong: sapient = able to make judgments; sentient = able to sense. Are you saying that "human-like intelligence" does not include the ability to make judgments? Fine, but that's not what this sentence is saying. That's what this sentence _assumes_. If that's your problem, change the wording. Solemnavalanche 06:08, 23 November 2006 (UTC)

Hotly debated?

A paragraph from the text:

Science is making some progress on animal psychology, and evidence of sentience is gradually being seen in animals, such as apes and dolphins. Still, it is a hotly debated issue.

This seems just so stupid. How can people debate if animals can feel? Are those people the blind or non-sentient ones? I was going to remove the paragraph since this should be obvious, and no further scientific research should be needed (it has even been proved that all the mammals share the neural patterns for basic feelings), but I'm letting people remark on this first. Rbarreira 15:45, 21 January 2006 (UTC)

I agree that the paragraph should be removed, but it's not obvious that animals are sentient, and it's not fair to describe it as stupid. I can't even be sure that any human other than myself is sentient. What about insects, bacteria, plants? Where does one draw the line? Very little about sentience is obvious. --Aaron McDaid 18:24, 22 January 2006 (UTC)
Considering the definition of sentience given in the text, I think that one can say that animals are sentient as much as human beings (a bit as you've pointed out). So, I don't think it's fair at all to put a focus on animals.
Considering that there are absolutely no references cited anywhere in the section, this discussion is completely useless. It would be nice if someone could take care of that. 24.251.0.143 03:25, 2 June 2006 (UTC)
Is it even clear that humans are sentient? I, or you, could easily be the only one who "is". There is no bullet-proof way to prove that you see in the same way as I do, or if you just react to objects recognized automatically by your brain, and seem from the outside to be existing in the same way, but are lacking the meaningless, undiagnosable thing that is sentience. (Some Zen stuff comes in here: Do you see the shape of a tree, or do you just see a tree?). The only way to figure out if someone exists in the same way as myself would be to become that person, and even if that was possible, there would still be the unanswerable question of whether my capacity for sentience is even connected to the physical world. Does it move across bodies? The assumption that sentience is possessed not just by oneself, but by all humans, is the most prominent and debatable assumption in life. Dylan McCall 19:15, 21 August 2007 (UTC)

The above can be countered by 'I think, therefore I am.' In other words, we exist in the universe and can confirm our existence through our senses - we also need to confirm that others exist and can confirm their existence through their senses, for the sake of sanity. That's why humans typically figure it necessary to socialize with one another every now and then. Animals, on the other hand, can't participate in every aspect of socializing as humans can with each other. We can only confirm what is present before us. Such is considered 'observational phenomena'. In other words, we need to assume that it exists because we can see it in front of us - in fact, documentation provides less proof than simply getting proof of something through our own senses. This is because documentation risks bias and is based on prior documentation, based on notions and methodology of how to interpret the universe (the latter is the avenue of all core knowledge including science, but is nonetheless itself dependent on proof from our own senses!).

So that said... I think some general consensus needs to be agreed upon if we're going to continue discussion on the different intelligence values of life forms. Otherwise, we'll never reach it until science makes a new discovery - which will probably result from new technology providing a more in-depth look on one of the most fundamental questions of the universe relegated to psychology. Such a thing could take hundreds of years, however. So yeah.

So really, in regards to discussion - right now, I'm thinking of just three (besides number 0).

0. No Intelligence - Cannot react to stimuli and hence cannot contribute to its surroundings in any way. Creatures of this manner are probably not organisms at all. They could be simple non-computerized mechanical devices. Most matter and energy in the universe falls into this category.

1. Contingency Based Intelligence - Reacts to stimuli, but in a confined manner that is limited in productivity or 'creativity'. Such might include plants and modern computers. Also known as 'robotic intelligence'.

2. Sentience - Reacts to stimuli in a very dynamic manner, beyond that which current software technology can render. Many perceive this state as debatable, and as such this state of intelligence is the subject of much of philosophy, perhaps even before the existence of writing. Anything of this state could be considered capable of 'feeling', which may or may not include basic emotions such as 'happiness', 'depression', and 'anger/aggressiveness', but none of the more complex emotions such as 'frustration' or 'embarrassment'. As such, many animals could be relegated to this state of intelligence.

3. Sapience - Primarily associated only with humans (homo sapiens). Less rigidly, sapience relates to the ability to form ideas and express them in a detailed language that allows their distribution among those that can interpret the language - usually other sapient organisms; note however that sentient organisms can participate by interpreting simpler concepts, e.g. a dog being ordered to 'fetch'. Contingency-based creatures (non-organisms) can also be produced by humans in the form of computers and electronics.

Sapience, as the subject of debate, might be considered to allow more 'free will' since a more intelligent organism has the power to simply choose from a greater variety of memories and knowledge in order to assist it in making decisions. Better deciding power lends to greater creativity which lends to a natural tendency to think about things in general - hence the influence of philosophy in human civilization.

Opinion: It's possible that primates such as chimpanzees or other mammals such as dolphins could be sapient. Humans are definitely sapient, although earlier hominids could easily be considered sapient as well, as far back as those who began using tools (homo habilis?). Sapience is a rare thing, typical of only the highest life forms on earth - such organisms require ideal conditions where they are unhindered and allowed to develop mental faculties, whenever their evolution determines that their physiology is defined enough for their role in an ecosystem. 142.35.4.130 (talk) 18:12, 23 November 2010 (UTC)
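The four tiers proposed above can be written out as a small enumeration. This is purely an illustrative sketch of the commenter's proposed scheme (the names and ordering are theirs; the code itself is hypothetical):

```python
from enum import IntEnum

class IntelligenceTier(IntEnum):
    """The four-tier scheme proposed in the comment above.

    Using an IntEnum makes the proposed ordering explicit:
    a higher tier implies strictly more capability.
    """
    NO_INTELLIGENCE = 0    # cannot react to stimuli (most matter/energy)
    CONTINGENCY_BASED = 1  # confined, "robotic" reaction (plants, computers)
    SENTIENCE = 2          # dynamic reaction, basic feelings (many animals)
    SAPIENCE = 3           # ideas expressed in detailed language (humans)

# The ordering encodes the comment's claim that every sapient being
# is also (at least) sentient:
assert IntelligenceTier.SAPIENCE > IntelligenceTier.SENTIENCE
```

Of course, the tiers themselves are the debated part; the sketch only makes explicit that the proposal is a strict, linear ordering, which the later comments about degrees of sentience push back against.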

useless debate? rewrite

How does one measure consciousness? Are there levels of it? What is the threshold for sapience? How can you devise such an experiment? What is the difference between self-awareness and consciousness? Are sentient beings also sapient? If not, how can you regard such qualia? What did various people in history say about this?

This article is VERY lacking; I marked it for rewrite and references.--Procrastinating@talk2me 11:01, 6 June 2006 (UTC)

Measuring consciousness is probably already part of another psychological article. The threshold of sapience cannot currently be determined. The difference between self-awareness and consciousness is that self-aware creatures pass the 'mirror test' (doing so is considered a major leap, obviously) while those that do not are merely 'conscious', or 'sentient' in other words - meaning they can still react to stimuli and probably feel basic emotions. I don't think dogs pass the mirror test.

Sentient is not sapient - the two are different definitions. Sentience is a lot more general of a term than sapience since all existing sapient beings are also sentient - it will probably always be this way; unless you had, maybe, a cybernetic human that had none of its five senses but still depended on input from a controlled environment - of course, that's a sci fi notion. But not all sentient beings are sapient - that is to say, not all of them can express complex ideas through a medium (usually a complex and detailed language that exceeds 'clicks' or 'barks'). Of course, pre-sapient humans probably shouted in a variety of ways (maybe just intonations - not the actual tongue and lip work of genuine sapient language) that could have meant things like 'I'm in danger' or 'hello' or 'I want attention!' or 'I'm big and tough!' but didn't get much more complex than that. Most monkeys probably already communicate all that and the communication between a mother monkey and child monkey is all non-verbal since the mother monkey basically just controls the children (she usually grabs hold of them for breast feeding or whatever or for any other reason) until they decide to run off (usually when they're big enough themselves). 142.35.4.130 (talk) 18:25, 23 November 2010 (UTC)

Is there a word that means neither sentience nor sapience?

Is there a word other than "non-sentient" or "non-sapient" that exists which means the same thing? --24.18.98.100 02:47, 22 June 2006 (UTC)

Yes, the word is automaton (plural automata), and it implies robotic - that is, not conscious (sentient), which of course entails not self-aware (sapient).

The word "bathtub" means neither sentience nor sapience. There are others. David Olivier 07:57, 7 December 2006 (UTC)

such as Nano-elephants, for example. --Procrastinating@talk2me 16:32, 7 December 2006 (UTC)

Synthesis via sentience

Sentience is the abductive part of Inquiry relative to Interaction. Data is brought into the body, compared to existing information (deductive), either rejected and/or stored in memory (inductive); or integrated (or synthesized) as knowledge, or sapience.

The integral process of synthesizing tools, including more advanced organs - is relative to conditions, or; conditional (see Interaction). The brain, as does all interconnected systems within our bodies; generates more flexibility, or adaptability to continue the advancement, or synthesis process.

These adaptations are encoded, fused, or synthesized - into our DNA and exercised from generation-to-generation. Major paradigm shifts of more advanced generations constitute oscillations that empower more tools, more synthesis - but not necessarily more code.

I don't understand this too well, for if you happen to say that our experiences become, post hoc, encoded into our genetic material, that's Lysenkoism and obviously an error.

Artificial intelligence is a tool, as are we, in our struggle to advance ourselves through the diversity of origin - which constitutes the magnification of the simple synthesis process based on the irrefutable laws of physics. Hypotheses of the integration of AI and our bodies may be modeled by the similarity in nanotechnology patterns, and Synthesis patterns (see Technological Singularity as well).

The synchrony of magnified power has vacillating effects that mirror a fireworks universe. We may be on a path unfolding as vacillating patterns, but hypotheses such as RNA and PAH can empower us to break these patterns toward a more oscillating synthesis with our planet, so we can persist. --Dialectic 22:43, 11 July 2006 (UTC)

I think the above just expressed everything I've been trying to say (of course, I'm saying this four years later) in a way that everyone else understands. In Wikipedia jargon, in other words. ;) 142.35.4.130 (talk) 18:29, 23 November 2010 (UTC)

Latin Root "sapere"[edit]

In the article sapience it says that the Latin root sapere means "to taste", but in this article it says it means "to know". (Antonio.sierra 16:35, 2 March 2007 (UTC))

To "taste" is sapio. Check here: http://en.wiktionary.org/wiki/sapere 98.165.209.7 (talk) 04:34, 8 April 2008 (UTC)

Sentience definition is flawed[edit]

"refers to possession of sensory organs, the ability to feel or perceive" - Having ears, eyes, etc. is not a sufficient condition for sentience, though the definition implies that it is.

Changing it to "refers to the capacity to feel or perceive" would make more sense.

This really needs a re-write. 163.1.42.12 17:04, 3 June 2007 (UTC)Tom

Thing is, pretty much every organism that has eyes or ears or whatever is sentient anyway: they all have the capacity to feel or perceive, otherwise evolution wouldn't have given them eyes or ears, etc. So the definition seems suitable as a generalization for people not really interested in looking further into the subject. 142.35.4.130 (talk) 18:32, 23 November 2010 (UTC)

It is an adequate definition, though not a particularly good one. Even considering how much we know about sentience and the physiological processes behind it, we still do not know whether sentience requires sensory organs; after all, if my brain were duplicated but detached from my senses, I would probably still exhibit sentience. So I agree it should be changed, but by cutting out "feel", because there is no necessary correlation between sentience and feeling, only perception. Mike of Wikiworld (talk) 12:29, 28 August 2011 (UTC)

AI Sentience[edit]

"(...) so the question arises as to whether computers with artificial intelligence will become sentient. (...)"
"(...) Sentience refers to possession of sensory organs, the ability to feel or perceive, not necessarily including the faculty of self-awareness. (...)"
By this definition, is any video camera, audio recorder, or other input device sentient?
Or must the device analyze its input to infer something about it? --200.223.68.5 21:52, 23 June 2007 (UTC)

According to the Merriam-Webster's dictionary, sentience is being "responsive to or conscious of sense impressions <sentient beings>". If you push a key, the computer hopefully responds. Then the computer is sentient, and so is an amoeba. Now consciousness, "the perception of what passes in a man's own mind" (Locke), is something completely different. If we replace "Mind" ("in the Western tradition, the complex of faculties involved in perceiving, remembering, considering, evaluating, and deciding" (Britannica)) with "inner states", it still becomes something else, less magnificent, and a computer can do it too (report inner states that only exist in it), though our communication with amoebas (and other systems) has been faulty. (Just for the record here, I must state that in my personal little opinion this sentience thing is a false problem.) --Xyzt1234 10:55, 3 August 2007 (UTC)
To my knowledge, the word "sentience" is only used in reference to artificial intelligence in fiction. I have rewritten this section from that perspective. If there is some source I have overlooked, feel free to cite it and change this back. ----CharlesGillingham (talk) 08:05, 13 December 2007 (UTC)

My own opinion is that anything can respond to stimuli, but only sentience allows a varied enough response that adapts and considers input on a number of levels. In other words, a cow is usually considered quite stupid for an animal, but cows can respond in a variety of ways to being poked in the eye or bitten by a flea. Software technology only allows a program to respond according to one contingency: if the fly bit the cow on its backside, the cow would whip its tail 90 degrees clockwise, but in absolutely no other pattern conceivable beyond that contingency. In real life, cows need to be able to respond in multiple ways to one contingency. Their reactions to stimuli are, in essence, far more dynamic than current computing technology can render. So, yeah, that's how I consider sentience or 'consciousness'.

Anything less than that kind of sentience is simply a contingency. I don't think jellyfish are sentient. I'm not sure whether amoebas are smarter than present-day bacteria, although bacteria are attracted to food sources. Do jellyfish even eat? Plants certainly are not sentient and react to no stimuli except the presence of food sources, by inducing photosynthesis or what have you. Many plants are probably just contingencies.

That's just my opinion on the matter. 142.35.4.130 (talk) 17:24, 23 November 2010 (UTC)

Confusing sentence[edit]

Dear friends, the author of this sentence, under "Philosophy and sentience", had better have a second go at it:

"There continues to be much debate among philosophers, with many adamant that there is no really hard problem with sentience whatsoever."

While it may be true that there is much debate, one has to take that on faith at present. More importantly, and this is why I am really taking the time to write: "...with many adamant that there is no really hard problem with sentience whatsoever" is something close to gibberish or nonsense. Perhaps it is nonstandard English, or written by someone who does not have English as a mother tongue, or maybe he or she didn't really think the sentence through or didn't know exactly what they intended to say; in any case, right now the useful content of the sentence is close to zero. I move that it be struck until someone can articulate a new sentence that succeeds in adding knowledge and/or meaning. With good intentions and the desire to engage in constructive criticism, Sean Maleter.--Sean Maleter (talk) 09:47, 9 February 2008 (UTC)


Removed from Animal Rights section[edit]

I have removed a paragraph:

On the other hand, some have argued [citation needed] that modern science cannot determine exactly where sentience begins, going from bacteria to animals. This would pose considerable complications for a theory of unnecessary suffering. Others [citation needed] take no objection with the conclusion that it's wrong to cause unnecessary suffering, but contend that on this issue the moral concept of right/wrong shouldn't mirror human nature but should instead be modelled from nature. Since animals routinely kill each other and inflict (at times unnecessary) suffering on each other, then as part of animalia it wouldn't be wrong for us to also. This is a view most of the world's population follows, whether intentionally acknowledging it or not.[citation needed] Therefore, the reason the rules of nature regarding killing aren't applicable towards other humans is because we are then dealing with the human realm. Our own psychology and the collective sociology make it unfavorable (i.e. less safety, added stress, reduced efficiency) to partake in killing other humans. Seen in this light, it would not be speciesism to kill animals but spare humans, but instead an outgrowth of humans' (as a species) naturalistic adaptation while observing all natural ethics regarding suffering.

It has no citations, reads like POV, and has no place in this section of the article anyway. Information on the philosophical aspects of sentience (with citations) would belong near the top of the article, not within the AR or sci-fi or whatever sections, as placing it there makes it look like only one aspect of non-human sentience is in question.

Arguments about animal rights itself (the "other animals eat each other" one, for instance) are way outside the scope of this article, just as anecdotes about robot rebellions would be. Noxic (talk) 21:25, 17 April 2008 (UTC)


Man traditionally forfeits his right to kill in order to receive protection from the state under its code of laws.


This article..[edit]

Needs a beatin'.

The evidence of this article as a whole only suggests that humans are able to make Wikipedia, where they can argue in perpetuity over the contradictions of their observations.

Don't expect some harbinger of light to descend and lift this horrid article from the intangible and ridiculous. All that is at stake here is people's own egos. I doubt anyone will single-handedly articulate the nature of reality for humans and nonhumans on this page anytime in our lives.

Might I suggest some simple advice to everyone, though: observe more. This article just verifies that we as humans have absolutely no solid proof of reality, or, even more, no solid proof of our own intelligence.

Another practical example of a universal intelligence is acting harmoniously in nature, with compassion. I'm sure there will be a lot of surprise at the revelation that our world is much like a jigsaw puzzle, where things may not be equal but balance out, because each shortcoming is filled by an extension of something else. There is nothing to be gained or lost from this stance.

Human-superiority advocates arguing against any animal intellect are not arguing against nature itself, because nature and the beings within it are unmoved by the rather dry conversation going on here.

It's purely for our pride and ego, whether we are talking about nature or not. We're only talking about ourselves and the motivations behind our own desires. If you gave an animal a pencil and knowledge of literacy, I'm sure its argument would be somewhat similar to ours: motivated by self-interest. I hope that clarifies some things for you all. --Overdose&YellowJacket (talk) 03:13, 6 January 2009 (UTC)

The fact that we are able to argue over this proves that we are sapient and not just sentient. Sorry, just had to make a point. 142.35.4.130 (talk) 18:35, 23 November 2010 (UTC)

Science Fiction and Sentience[edit]

I have made minor changes to the Science fiction section; namely, I included android in the list at the beginning and changed the description of Star Trek's Data to "android" rather than "robot", as there is a distinction.

For reference: http://en.wikipedia.org/wiki/Android#Ambiguity http://en.wikipedia.org/wiki/Data_%28Star_Trek%29

Also, though I do not find it important enough to include in the article, it is worth discussing what things might be given sentience in a work of science fiction. I would argue that the list in the article is far too short, as in my experience science fiction writers, in their infinite search to challenge our preconceptions, have given sentience to quite a range of living, nonliving, and quasi-living things. Example: a bowl of petunias in Douglas Adams's The Hitchhiker's Guide to the Galaxy thinks "Oh no, not again!" before falling, and is later revealed to actually be an incarnation of a sentient being. 202.220.175.152 (talk) 11:59, 8 May 2008 (UTC)

Kill the philosophy section?[edit]

The section on philosophy seems woefully incomplete. It only captures one (relatively recent and unimportant) thread in philosophical study of sentience. The animal rights, buddhism, and science fiction sections seem more or less on target for their fields. Should we just bag the philosophy section until someone shows up who is qualified to write it? ---- CharlesGillingham (talk) 21:33, 13 August 2008 (UTC)

I agree that the philosophy section is very incomplete. The New Mysterian point of view is, I believe, quite notable in its own right, but shouldn't be the only one mentioned. I don't think these are good reasons for deleting the section; rather for completing it. I myself could add a bit concerning some aspects — certainly not all — but I don't have the time right now. I think it's better to leave it as it is, perhaps with some tag or another (where do you find the list of all those Wikipedia tags?) to notify that it is lacking. David Olivier (talk) 16:13, 14 August 2008 (UTC)
Okay. I think I'll just rearrange the sections so that the most complete ones come first. ---- CharlesGillingham (talk) 16:46, 14 August 2008 (UTC)
You yourself (CharlesGillingham) appear to be a lot into AI. It would be interesting for you to add something from that point of view, even if it is that sentience does not exist, or is not a sound concept or something like that. David Olivier (talk) 16:16, 14 August 2008 (UTC)
Well, that's kind of what I did. Any discussion of "sentient computers", even in the AI community, tends to use the definition used by science fiction. There is no other widely accepted definition. So the section on s.f. pretty much covers what there is to say about it for now. ---- CharlesGillingham (talk) 16:43, 14 August 2008 (UTC)

Difficulties of definition[edit]

This topic is a real tough one. I can see at least 4 meanings in use here:

  • Awareness of oneself and others as persons.
  • Bentham's "ability to suffer".
  • Jainism's 1-5 scale.
  • Reasoning ability, which I suppose includes the computational definition.

Re "definition that is used by science fiction" (previous thread), what exactly is that? I've often seen "sentience" = "reasoning ability" in SF. OTOH I've seen comments on David Brin's Uplift Universe that distinguish between sentience & sapience, so that the Tandu are sapient but at best marginally sentient, as they're a bunch of intelligent psychopaths. In [[Iain M. Banks]]'s Culture stories the difference between Minds & drones vs lesser AIs is awareness of oneself and others as persons.

It will also be necessary to clarify Sapience, as that article seems to treat sapience as identical to sentience. --Philcha (talk) 21:59, 13 April 2009 (UTC)

Yes, the topic is a tough one. However, I don't think that sentience corresponds to any of your four definitions. There is no reason to include in sentience the idea of recognising anyone as a person. Similarly, sentience isn't the ability to reason. I think the basic definition should be simply: the ability to have any feeling whatsoever. (Not necessarily to suffer.) On the definition level, we should not try to be more specific than that. How feeling is in turn defined, and what it amounts to physically, is a matter of dispute. But basically, I think the definition should be that. David Olivier (talk) 23:11, 13 April 2009 (UTC)
I have 2 concerns about your definition:
  • As you say, how feeling is in turn defined is a matter of dispute.
  • What matters are the sense(s) in which sources use the term. --Philcha (talk) 23:32, 13 April 2009 (UTC)
I haven't cited sources but neither have you. It can take time to dig up sources, but Tom Regan might be a good place to look. It is clear in his Case for Animal Rights that sentience refers to beings who feel, whether or not they have a concept of their own or others' personhood or existence over time. A snail might be sentient, he says, but not a subject-of-a-life, who has the understanding of his/her continued existence. As for the definition of feeling: actually, I don't think the definition of feeling is disputed; everyone knows “what it is” to feel. What is disputed is how feeling fits in with the physical world. That problem does not make defining sentience as a capacity to feel problematic in itself. David Olivier (talk) 07:39, 14 April 2009 (UTC)
It wouldn't be difficult to find sources - I've been an SF fan for decades, I've mentioned 2 series that raise the issues, so Googling for commentaries will be productive. However I can't afford the time to research and edit this article as my to-do list is quite large already.
I'm in favour of presenting animal rights aspects, provided it's WP:NPOV and does not crowd out other aspects - in fact, as a regular GA reviewer, I'd probably consider the article incomplete without some consideration of animal rights. However, any presentation of animal rights here should be only a summary of good content in other articles, as the moral philosophy of the subject is complex, especially for organisms that can survive only by doing direct harm to humans. --Philcha (talk) 09:35, 14 April 2009 (UTC)
The issue here is not animal rights, but the definition of sentience. The fact that Regan gives a definition of sentience in a book about animal rights doesn't mean that that definition is valid only if you agree with him about animal rights! It is a legitimate example of how the term is used and defined. And I think it's somewhat more valid than random quotes from SF literature. David Olivier (talk) 12:34, 14 April 2009 (UTC)
"somewhat more valid than random quotes from SF literature" is arrogant, and makes no more sense than anthropocentric definitions of anatomical terms that apply to a much wider range of animals.
I'm too busy to research this now, but if you try to take over this page and / or remove other senses of "sentient", I'll research it, reference it, re-insert it and report you for POV-pushing and WP:DE. --Philcha (talk) 13:11, 14 April 2009 (UTC)
Oh my! David Olivier (talk) 14:31, 14 April 2009 (UTC)

I think it is obvious that there are several definitions in use. This article just lays out each definition. The first paragraph gives the dictionary definition, and then summarizes the other sections. The other sections each give a slightly different definition and identify the field that uses it. This is an appropriate way for Wikipedia to handle this kind of article. It gives all the definitions in use, for WP:Comprehensiveness and WP:Neutral point of view. It doesn't let one or another definition have precedence. This article is far better off than, say, self-awareness (which is a simply terrible article), embodiment (which leaves out every definition for which there is no article), or consciousness (which uses the neuroscience definition and completely ignores the new age definition). I'm not sure I see a problem that needs to be solved here. ---- CharlesGillingham (talk) 05:24, 15 April 2009 (UTC)

I'm concerned about the precision of the language in the current definition, where "Sentience is the ability to feel or perceive subjectively." I have the impression it should be "Sentience is the state of feeling or perceiving, subjectively." I have the "ability to" dance; however, I cannot choose whether or how I perceive "red". That is: I have the capacity to dance, but those wavelengths of light striking my eyes, and the subsequent electrochemical signals interpreted by my brain, I call "red". Since red is a function of consciousness, and it is not clear that I can perceive red if I'm unconscious or have my eyes closed, that ambiguity in the definition can be avoided with the present perfect continuous or simple present tense. Grimsooth (talk) 17:42, 24 November 2009 (UTC)

I'm chiming in here because I too have issues with the definition. It may help to form a WP:concept cloud of all the relevant concepts here, beginning of course with sapience and its distinction. It seems to me an essential part of the animal rights argument that animals have a being (in its subjective meaning), i.e. they have a certain higher consciousness beyond their physical senses, such as to be "beings" (in the objective sense).
And yet it seems to me that the definition of "sentience" vastly transcends the senses, and has a meaning suggesting intellect and self-awareness that transcend the level of mere animals. The animal rights argument is that animals (certain ones, anyway) are not just biological organisms but "beings." I'm not certain that such a fringe argument as "animal rights" should have such sway and influence here as to augment the very meaning of "being" and "sentience."
Can we regard the "animal rights" argument as simply a fringe one, and then deal with those arguments as we do with other "fringe" ideas? Regards, -Stevertigo (w | t | e) 22:58, 3 May 2010 (UTC)
I do not think that we can regard the animal rights view as "fringe", since they have been using the word cogently for a very long time and it is central to their argument. The newer definitions, coming from Buddhism/Jainism/New Age or from science fiction, are also a bit "fringe", in my view. Animal rights at least defines the term without resorting to vague synonyms for the "self" or "soul".
I have re-edited the introduction. Please take note of the edit comments, where I argue point-by-point for each change. If you disagree with one of my points, please feel free to discuss it here. ---- CharlesGillingham (talk) 00:05, 8 May 2010 (UTC)

(break)[edit]

  1. Charles writes (here and in comment lines): "I do not think that we can regard the animal rights view as "fringe", since they have been using the word cogently for a very long time and it is central to their argument." - Please explain what you mean by "cogent," "very long time," and "they." Under only the rarest circumstances would I accept an argument wherein its foundation is so entwined with its own premise. Arguments for animal sentience (consciousness of self, self-awareness) are notably limited by attempts at language acquisition, for example, so your premise of giving that POV such sway cannot be held beyond a tertiary remit.
  2. "Animal rights at least defines the term without resorting to vague synonyms for the "self" or "soul"." - "Self" is not a "vague synonym." It is a concept of philosophy, philosophy of mind, psychology, religion, etc. that indicates that the individual has a concept of it...self.
  3. "Sentience may not strictly be a capacity of organisms, and this is not part of the definition. It may be possible for inorganic machines to be sentient." - This is unfortunate. Even were it possible for a machine to gain sentience, that machine would thus have gained a being, which is a backwards way of saying your machine is now also an "organism," though not by necessity a biological one.
  4. "Defining "sentience" in terms of "being" goes beyond the dictionary definition of "sentience" . Also, "being" is even more vague that "sentience"." - "Being" is also not vague. See the being article for a basic overview. I may agree that the definition I wrote seems to go beyond the definitions we are finding on online dictionaries and so forth, but if you'll notice, they all typically mention consciousness. And they do so in such a vague way as to make the ambiguity you note quite understandable.
  5. "Sentience has not be quantified at this time, to my knowledge." - There is an article called sentience quotient, which involves a scientist of some kind attempting to find a correlation between information science and neurophysiology.
  6. Charles, note here how stripped down the article is with your current definition, one which I would call vague and inaccurate. By using words like "self" and "being" it becomes not difficult to write more definitively, and not in the way philosophy articles tend to unravel. Regards, -Stevertigo (w | t | e) 03:16, 8 May 2010 (UTC)
  1. "They" is the philosophers who have made the case for animal rights, including Rousseau, Bentham, etc. The "very long time" is at least to the 18th century. (See Animal rights#18th century: The centrality of sentience, not reason). (Unless you count Buddha, who often discussed "the suffering of sentient beings" in the first millennium B.C., but of course this is a modern translation.) "Cogently" could be considered a matter of opinion, I suppose, but at the very least they have always made a precise distinction between "reason" and "sentience" and so, for them, the word has always had a precise meaning. ---- CharlesGillingham (talk) 10:50, 9 May 2010 (UTC)
  2. I think having a "concept of yourself" is called "self-awareness" and (some would argue) this is slightly different from sentience. I think that sentience has a precise meaning that does not necessarily include self-awareness as part of its definition. It may be true that all (human) beings and all metaphysical beings are sentient, conscious, self-aware, intelligent and so on, but that doesn't mean that the words all mean the same thing. It's only when we begin to look at unusual cases that the definition begins to be important. A rose is a "sentient being", by Jainism's definition, but it is probably not conscious or self-aware or intelligent, unless you redefine these words as well. As for being, it is very arguable exactly what is a being and what isn't a being; some believe that the stars have a "self" of a sort. This is how it is vague. Sentience, on the other hand, is defined much more precisely. ---- CharlesGillingham (talk) 10:50, 9 May 2010 (UTC)
  3. "Non-biological organism" is a metaphor at best and an oxymoron at worst. If you're going to redefine every word to mean the same thing, what's the point of having different words? ---- CharlesGillingham (talk) 10:50, 9 May 2010 (UTC)
  4. The Oxford English Dictionary gives the primary definition as this: that feels or is capable of feeling; having the power or function of sensation or of perception by the senses. I think consciousness is worth mentioning, since sentience is certainly an aspect of consciousness. (If we have subjective perceptions, we are "conscious" of something, and therefore we "have" consciousness.) I think "being" is getting too far away from the definition: the identity of being, self, consciousness and sentience is something which needs to be proven, not something we can simply assert at the start by definition. ---- CharlesGillingham (talk) 10:50, 9 May 2010 (UTC)
  5. You do realize that the article on the "sentience quotient" defines sentience in terms of information processing capacity per unit mass, right? This is a fringe definition of sentience, and about as far from a definition in terms of "being" as you can get. ---- CharlesGillingham (talk) 10:50, 9 May 2010 (UTC)
  6. I'm not in favor of saying "no one knows what it means". I think the dictionary definition is very clear. Forgive me for being pedantic but just for review: "that feels or is capable of feeling; having the power or function of sensation or of perception by the senses". There is nothing there about the "self" or the "soul". The connection to the self is that perception is central to consciousness and consciousness is central to the self. If perception is only truly possible for a conscious being, then all conscious beings are sentient and all sentient things are conscious beings. But even this does not imply that the words "sentient" and "having a self" mean the same thing; they are not identical by definition. (Even if the words do, in fact, have the same denotation, they still retain a different connotation). We need to give the precise definition first. Then we can describe how various fields have connected sentience to consciousness, intelligence, rights and being. ---- CharlesGillingham (talk) 10:50, 9 May 2010 (UTC)
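For what it's worth, the "sentience quotient" that keeps coming up in this thread is a purely physical yardstick: Freitas defined SQ as the base-10 logarithm of an entity's information-processing rate (bits per second) divided by its mass (kilograms). A minimal sketch of the arithmetic, using illustrative figures that are assumptions rather than measured values:

```python
import math

def sentience_quotient(bits_per_second: float, mass_kg: float) -> float:
    """Freitas's SQ: base-10 log of information-processing rate per unit mass."""
    return math.log10(bits_per_second / mass_kg)

# Illustrative, assumed figures: a ~1.4 kg brain processing ~1.4e13 bits/s
# lands at the often-quoted human value of about +13.
human_sq = sentience_quotient(1.4e13, 1.4)
print(round(human_sq))  # → 13
```

Whatever one makes of the definition, note that nothing in it refers to "being", "self", or feeling; it measures only processing per kilogram, which is why it reads as fringe in this discussion.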

(break)[edit]

  1. Charles writes: ""They" is the philosophers who have made the case for animal rights, including Rousseau, Bentham, etc." - Can we boil down their concepts about sentience to an essence and not elevate them beyond the views of others? There are perhaps hundreds more philosophers who have written about "sentience" who use the term precisely in the way animal rights activists (not philosophers) do not: to distinguish human from animal. There are many more notable philosophers (Aristotle, Descartes, etc.) to get to before we elevate anomalous concepts to first place here. Charles writes: "Unless you count Buddha, who often discussed "the suffering of sentient beings" in the first millennium B.C.," - Many consider Buddha an excellent source. ""Cogently" could be considered a matter of opinion, I suppose.." - Yes, that's one problem with superlatives. "..at the very least they have always made a precise distinction between "reason" and "sentience"" - I will settle for their very best, for example those inventions which are not anomalous, but rather are ones we live under today. In Bentham's case, he apparently devised a particularly Orwellian type of prison. -SV
  2. Charles writes: "I think having a "concept of yourself" is called "self awareness" and (some would argue) this is slightly different than sentience." - First of all, the term "self" has vastly more usage than could be boiled down to a term like "self-awareness", so your premise here of reducing "self" to its usage in another term is unsupportable. One of the primary concepts in philosophy is that "self" transcends what falls under "self-awareness." We are not always aware of ourselves, nor are we aware of all that is in "self." If you can either destroy my premise that we need a word, in the context of discussing "sentience," for how "a being" considers its own being, or else find a better word for it ("it" being "self")...-SV
  3. Charles writes: "I think that sentience has a precise meaning, that does not necessarily include self-awareness as part of its definition." - I agree. But see the answer above: do you see now why, if I suggest using "foo" in a concept, reducing "foo" to "foo bar" and then saying there is no "foo bar" in "[concept <relating to> foo]" is improper reasoning? Charles writes: "It may be true that all (human) beings and all metaphysical beings are sentient, conscious, self-aware, intelligent and so on, but that doesn't mean that the words all mean the same thing." - I agree, but you again are assuming that "self-awareness" (i.e. "foo bar") is a more precise term than "self" (i.e. "foo"). It is not. Charles: "It's only when we begin to look at unusual cases that the definition begins to be important." - You are confusing "anomalous" with "important." Charles: "A rose is a "sentient being", by Jainism's definition, but it is probably not conscious or self-aware or intelligent, unless you redefine these words as well." - A rose cannot be a being: it has no brain and therefore it has no mind. Even if one threw Gaia concepts into the argument, "sentience" would not be a property of the rose, but of an all-encompassing being that vastly transcended the "being" of the rose. Charles: "As for being, it is very arguable exactly what is a being and what isn't a being; some believe that the stars have a "self" of a sort. This is how it is vague. Sentience, on the other hand, is defined much more precisely." - This argument, perhaps best called a "some people [think]" argument, is not convincing. -SV
  4. Charles writes: "Non-biological organism" is a metaphor at best and an oxymoron at worst." - Actually it isn't. Charles: "If you're going to redefine every word to mean the same thing, what's the point of having different words?" - It only parses that way if you use the "bio-" in "biological" to indicate a meaning of "life" itself which is exclusive (and pre-emptively so). The way in which I used "..not by necessity a biological one" was to indicate, by irony, the term "biology"'s inherent dogma that all life forms are by necessity housed in forms crafted through naturally material processes. If you are a deist, or a supporter of AI research, you naturally consider the concept that "life" might exist outside of "biology." As you have by now figured out, "organism" in its most abstract meaning simply means an entity that has some kind of organised living processes going on: artificial, "biological", or otherwise. - SV
  5. Charles writes: "The Oxford English Dictionary gives the primary definition as this: that feels or is capable of feeling; having the power or function of sensation or of perception by the senses." - Webster's has a different definition, and it is likewise anomalous with respect to consciousness: "a) a sentient quality or state b) feeling or sensation as distinguished from perception and thought." (sentient: sentiens, sentire 'to perceive, feel,' date: 1632; 1: responsive to or conscious of sense impressions <sentient beings> 2: aware 3: finely sensitive in perception or feeling.) Keep in mind that dictionaries at best try to collect information on how a word is used. From a language-purist point of view, a word is nothing but how it is used, and hence no word has, by itself, a particular definition. My sense is that our time here, working on an actual understanding of this concept, is not wasted.
  6. Charles writes: "I think consciousness is worth mentioning" - I'm glad we agree, even though there are definitions we are finding that concern only "sense." How they define "sense," I wonder. Charles: "I think "being" is getting too far away from the definition: the identity of being, self, consciousness and sentience is something which needs to be proven, not something we can simply assert at the start by definition." - I disagree. We do not have proof of our own "consciousness," for example, except that which we see with our own minds. Inserting "proof" here is a monkeywrench. -SV
  7. Charles writes: "You do realize that the article on the "sentience quotient" defines sentience in terms of information processing capacity per unit mass, right? This is a fringe definition of sentience, and about as far from a definition in terms of "being" as you can get." - I agree about it being fringe, though I understand his intention: to create an imaginary yardstick for an underlying biological process as a way to claim a definitive meaning for a debated term. I'm being a bit facetious here: I understand how someone consumed with a "sense" definition of "sentience" would look at neurons (though that might be unfair to say).
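(For readers unfamiliar with the "fringe definition" under discussion: the sentience quotient is usually presented, following Robert Freitas, as the base-10 logarithm of information-processing rate divided by mass. The sketch below is a minimal, illustrative calculation, not an authoritative treatment; the specific input figures are rough assumptions, not sourced measurements.)

```python
import math

def sentience_quotient(bits_per_second: float, mass_kg: float) -> float:
    """Freitas-style SQ: log10 of information-processing rate (bits/s)
    divided by mass (kg). Higher SQ = more processing per unit mass."""
    return math.log10(bits_per_second / mass_kg)

# Illustrative, assumed figures: a ~1.5 kg brain processing on the
# order of 10^13 bits/s yields an SQ in the neighborhood of +13.
print(round(sentience_quotient(1e13, 1.5), 1))  # ~12.8
```

This makes concrete why the definition is "about as far from a definition in terms of 'being' as you can get": it reduces sentience to a single throughput-per-mass number.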
  8. Charles writes: "I'm not in favor of saying "no one knows what it means". I think the dictionary definition is very clear." - Well, can we agree that, per Chomsky and other linguists, dictionaries tell us little or nothing about what a word means? The word "water" is only approximately H2O. The word "sentience" has been used in language in a way not found in dictionaries, such that it has come to mean 'something qualitatively important' in the perception of the value of life - usually the lives of others ;-) (ie. our own lives we presume to have value). Charles: "Forgive me for being pedantic but just for review: "that feels or is capable of feeling; having the power or function of sensation or of perception by the senses". There is nothing there about the "self" or the "soul"." - Webster's and other dictionaries refer to "perception" - a term that cannot, in my mind, be extricated from "consciousness" and delegated just to "sense." Likewise, when we talk about "sense" in the context of animal sentience, etc., we are using the term in a way that indicates something important with respect to our concept of self. It's an abstract argument, but just as a "vague" word such as "love" has something to do with the Ethic of Reciprocity, so too does our attempt to conceptualize "self" have something to do with how we try to conceive of others. The word "sentience" represents an attempt to connect human reciprocity to the world of the beasts, via some commiserative or sympathetic insight that denies the equivalence between man and beast but likewise extends the ethic of reciprocity to the lesser. Charles writes: "The connection to the self is that perception is central to consciousness and consciousness is central to the self." - We agree that there are exactly two concepts in the loop between "consciousness" and "self." That the third might be "perception" I am not qualified to say. 
I will say it's a pretty damn good guess, and it's probably where I was going with the Webster's definition. Charles writes: "If perception is only truly possible for a conscious being, then all conscious beings are sentient and all sentient things are conscious beings." - I agree that we hold those words consciousness, sentience, and perception to a high standard, such that we have exclusionary properties associated with the beasts - hence, a person who is a "pig" has mastered these three to a lesser extent than someone who is not a "pig." Whether someone who is a "pig" has greater sentience than someone who is a "dog" I dunno. (PS: I didn't notice you said "sentient things." Ha. While we may agree that pigs can play video games, I would refrain from saying my salt and pepper shakers are "sentient.") Charles writes: "But even this does not imply that the words "sentient" and "having a self" mean the same thing; they are not identical by definition." - I never said they mean the same thing, only that they by communion of concept form a golden ring - a twisted string if you will. Charles writes: "Even if the words do, in fact, have the same denotation, they still retain a different connotation. We need to give the precise definition first. Then we can describe how various fields have connected sentience to consciousness, intelligence, rights and being." - OK. We are making progress. Now you just have to retire your own definition of "precise" and substitute it with mine, and we will be on our way. -Stevertigo (w | t | e) 21:16, 9 May 2010 (UTC)
You've brought up some interesting points, but nothing that can dissuade me from the view that "sentience" is "the ability to feel". This is what the word means. I notice that SlimVirgin has already restored the first paragraph with the proper definition, so I hope this issue is resolved for now. ---- CharlesGillingham (talk) 22:07, 19 May 2010 (UTC)
The problem with going with the standard definition alone is that definitions may not give a sense of the meaning of a term. In this case the word "sentience" has two definitions, "feeling" and "perceiving." Which is it? It's in fact neither, but a combination of them both, in a deeper way than is apparent from either the definition of "feeling" or that of "perceiving." Words often have a meaning which has developed through culture. It's what culture says that actually gives a deeper sense of a word's meaning. I think a problem here is that the article treats the subject sectionally without actually putting it together and adding it up. Definitions related to "feeling" seem to dominate some usages, while definitions related to "perceiving" dominate other usages. I'm currently reading Sentience (the preview, anyway) by Matson. The question is whether or not the word's usage — i.e. its meaning — distinguishes man from the beasts vis-a-vis "perception" (and its higher aspects), or associates them vis-a-vis "feeling," or "senses." -Stevertigo (w | t | e) 16:19, 20 May 2010 (UTC)

the Catholic Encyclopedia touches on this subject[edit]

http://www.newadvent.org/cathen/04542a.htm 24.191.87.42 (talk) 02:15, 7 April 2012 (UTC)

Science fiction definition[edit]

What would be an acceptable source for that definition? Perhaps the Star Trek: TNG episode in which Data attempts to receive the rights of a human as a sentient being. The definition there is:

  1. Intelligence
  2. Self-awareness
  3. Consciousness [1]

Gregory Benford, Greg Bear and David Brin also define sentience in their books. Asimov deals with sentience in his novel Nemesis (Isaac Asimov novel), and in the Foundation series, Seldon's plan has an axiom "that Human Beings are the only sentient intelligence in the Galaxy." There have got to be science fiction authors who debated the topic in some front entry of Asimov's or Fantastic Stories. These are harder sources to locate by web search. 97.85.168.22 (talk) 02:29, 15 August 2012 (UTC)

Here are some more sources:

Oxford Dictionary of Science Fiction:
sentience (n.)
1. an intelligent being. Compare sapient, sentient, sophont.

    1947 G. O. Smith Kingdom of Blind Startling Stories (July) № 48/1: Secondly, the true schizophrenic paranoid cannot rail against a mechanistic fate. He must find some sentience to fight, some evil mind to combat.
    1991 T. Bisson They're Made Out of Meat Bears Discover Fire & Other Stories (1993) № 35: First it wants to talk to us. Then I imagine it wants to explore the Universe, contact other sentiences, swap ideas and information. The usual.
    1999 I. MacDonald Days of Solomon Gursky № 253: PanLife, that amorphous, multi-faceted cosmic infection of human, trans-human, non-human, PanHuman sentiences, had filled the universe.
    2002 U.K. Le Guin Social Dreaming of Frin Mag. of Fantasy & SF (Oct.–Nov.) № 185: The duty of the strong-minded person, she holds, is to strengthen dreams, to focus them [...] as a means of understanding the world through a myriad of experiences and sentiences (not only human).

2. intelligence. Compare sapience.

    1954 P. Anderson Brain Wave № 120: Corinth's memory went back over what he had seen, [...] the life which blossomed in splendor or struggled only to live, and the sentience which had arisen to take blind nature in hand. It had been a fantastic variety of shape and civilization.
    1969 A. McCaffrey Dramatic Mission Analog SF — Sci. Fact (June) № 64/2: It was a convention among all the sophisticated societies she had encountered that sentience was not permitted to waste itself. Kira Falernova had found it excessively difficult to commit suicide.
    1991 D. Stabenow Second Star № 180: The Librarian had awakened Archy to sentience, to self-will, to an intelligence unprogrammed, unsupervised, not of human born.
    1994 L. A. Graf Firestorm № 23: Spock, have you started checking out the Johnston observatory's sentience report?
    2001 D. Gerrold Bouncing off Moon № 307: Intelligence exists as the ability to recognize patterns. Self-awareness is intelligence recognizing the patterns of its own self. Sentience is the ownership of that awareness — the individual begins to function as the source, not the effect of his own perceptions.

97.85.168.22 (talk) 02:42, 15 August 2012 (UTC)

'Animal rights and sentience' quality[edit]

The Animal rights and sentience section of this article is very poorly written. It is fragmented and poorly assembled. 74.43.60.101 (talk) 20:17, 7 November 2012 (UTC)

How is it related to oncology[edit]

How is this related to oncology? Should be related and included in article. --Inayity (talk) 09:03, 9 November 2012 (UTC)

Aspects of consciousness[edit]

"Other philosophers (such as Daniel Dennett) disagree, arguing that such aspects of consciousness do not exist." Which aspects? It's not clear whether this refers to aspects of consciousness other than sentience, or aspects of consciousness that cannot be explained. Clarification is required. Lhimec (talk) 19:16, 7 March 2014 (UTC)

Clarified this. ---- CharlesGillingham (talk) 04:35, 9 March 2014 (UTC)

Bias and subjectivity of definition of sentience[edit]

I looked up the definition of sentience and "the ability to feel" is not what came up.

Merriam-Webster: Sentience: feeling or sensation as distinguished from perception and THOUGHT. Thought: the action or process of THINKING. Thinking: the action of using your mind to produce ideas, decisions, memories, etc.

Dictionary.com: Sentience: having the power of perception by the senses; CONSCIOUSNESS. Consciousness: awareness of one's own existence, sensations, thoughts, surroundings, etc.

The scientifically respected Turing test and this article's account of sentience conflict greatly.

There is not only a bias in the article leaning toward an animal rights agenda, but there are far too many lines fixated on that agenda instead of substantial scientific fact and theory.

I recommend a complete re-write of this article that is fixated on the science of sentience, not the opinions of it. — Preceding unsigned comment added by 50.0.107.211 (talk) 16:37, 5 June 2014 (UTC)

You found these definitions:
feeling[1] or sensation[2] as distinguished from perception and thought[3]
having the power of perception by the senses[2]; CONSCIOUSNESS[4]
We have almost precisely the same definition in the article:
Sentience is the ability to feel,[1] perceive,[2] or to experience subjectivity.[4] Eighteenth-century philosophers used the concept to distinguish the ability to think (reason)[3] from the ability to feel (sentience).[1] In modern Western philosophy, sentience is the ability to experience sensations (known in philosophy of mind as "qualia").[4] The concept is central to the philosophy of animal rights, because sentience is necessary for the ability to suffer, which is held to entail certain rights.
So I'm not really sure what you are talking about. ---- CharlesGillingham (talk) 07:21, 11 June 2014 (UTC)
    • ^ Cite error: The named reference feel was invoked but never defined (see the help page).
    • ^ Cite error: The named reference perception was invoked but never defined (see the help page).
    • ^ Cite error: The named reference not_think was invoked but never defined (see the help page).
    • ^ Cite error: The named reference consciousness was invoked but never defined (see the help page).