WikiProject Psychology (Rated Start-class)
WikiProject Philosophy (Rated Start-class)
POSSIBLE COPYRIGHT VIOLATION
I have written a new article at the temporary subpage while waiting.
The main page about this article on Sentience Quotient says:
The previous content of this page appears to infringe on the copyright of the text from the source(s) below:
But it is Robert A. Freitas Jr. who owns the copyright, not lycaeum.org. As Robert A. Freitas Jr. says:
"I am indeed the originator of this idea, and this formula. The phrase "sentience quotient" and the associated formula first appeared in my Analog article on "Xenopsychology" back in 1984, although I originally conceived the idea circa 1977 while writing my book Xenology. The copyright for the Analog article is owned by me.
It is lycaeum.org that is in violation of my copyright. I have attempted to write to them in the past, insisting that they give me proper credit for this usage at their website, or alternatively to take down the material, but they have never responded. I would appreciate it if you would delete the infringing reference and URL to their material and cite only my Xenopsychology article in your Wikipedia posting, as this is the original and only true source. It is my wish that they not be given any undeserved publicity for their infringement -- until and unless they clean up their site by inserting a proper reference to my work. I believe lycaeum.org may have cribbed the material in question from a 1988 issue of Daedalus, which may or may not have included a reference back to my original material. The Daedalus material was not online, the last time I checked.
BTW, I now have revised my Xenopsychology page to make the infringement clear to any visitors."
- The text itself is ridiculously small (in fact it is little more than a simple equation), and because of this I imagine those in charge at lycaeum.org probably believe it is small enough to be considered fair use. I tend to agree. Can you copyright an equation? Even if you can, what purpose would it serve to do that? -- 126.96.36.199 14:36, 26 August 2005 (UTC)
- Reply by RAFJr: The primary issue here is not with their fair use, but rather with their failure to simply attribute this concept to its actual creator. Had they done this (i.e., added about 4 words to their website), I'd have no complaint whatsoever with their usage. [RAFJr 18 Sept 2005]
How politically correct is it possible to get? This theory is a formula based on math and physics, so why is it compared with phrenology and other pseudoscience? It is not a theory that is meant to be used on races or individual humans, but an idea of how fast or slow a brain can in theory be.
"The theory is controversial because it defines sentience according to a relationship between information processing rate and brain mass, yet there is no evidence that such a relationship is in any way related to a measurement of sentience."
Well, information processing rate, design, size and programming as a whole all have something to do with it. If we were talking about computers, would there still be any problems with the SQ? If you look at these elements one by one, you can't tell how fast a creature can process information. But all the elements together will tell a lot.
No offence, but I suspect it is compared with phrenology because of a lack of understanding and a little too much political correctness.
- I totally agree this need not be under pseudoscience. If the theory 'defines' sentience in a certain way, you can simply assume that this is a term not necessarily implying equivalence with sentience, but simply introducing a variable that is used as a numerical model. I do not see any problem.
Agreed. I believe a major obstacle regarding the use of such a construct would be the practical applications of purely hypothetical neural relationships. Moreover, the lack of sophisticated processing systems in our contemporary world relegates this concept to something of a far-reaching exploration of cognitive possibilities. The concept of a definable, falsifiable Sentience Quotient will become possible as our understanding of silicon-based systems improves.
The idea of a quantifiable "Sentience Quotient" isn't necessarily the same substance that directly fuels the engines of science fiction; rather, the measurement of abstract intelligences beyond the scope of simple organic functioning is a hefty equation that requires intensive analysis into perspectives we have yet to identify.
This concept is debatably akin to the introduction of photosynthesis into humans via cyanobacteria. Again, it's an idea that is too advanced to analyze with a truly skeptical mind.
Thanks, Aaron. 9.29.2006
I deleted the part about criticism because it only focused on the information processing rate and brain mass, while ignoring design and programming. Secondly, this is a theory, not a claim, and before there is enough data to support that it is correct or incorrect, it can't be described as pseudoscience. And if we are talking about the processing rate in single celled microbes, then we are talking about a different subject. 188.8.131.52 14:53, 19 January 2007 (UTC)
Dr. Freitas is a computer engineer who was toying around with the human brain's efficiency. Being more engineer than psychologist, it's only natural that his interpretation of brain efficiency would be mathematically based. From a physics point of view, the "political correctness" issue is moot because, if anything, what the SQ equation does is basically shoot down virtually all racialist intelligence theories ever made. An SQ of 13, according to the results of that equation, applies to all humanity, because every race on this planet has adapted its body for maximum cognitive efficiency. That means, if we define intelligence as data crunching in accordance with what data crunching a given society considers important, virtually everyone has the same intellectual potential according to this equation, which, if I understand correctly, makes no distinction between someone with an average I.Q. of 100 and someone like Stephen Hawking with an I.Q. of (I believe) 280.
All this equation ultimately says, people, is "the human brain is this efficient; here's the math." That's all it really says. Clearly, "smart" people have trained their brains to make good use of that SQ 13 potential. With that kind of data efficiency, though, human intellect could hypothetically reach as far as an I.Q. of 500. Scary to think about, but if I interpreted the math of the SQ equation correctly, it is not mathematically impossible. An alien race with an SQ of 50 would have intelligence and data processing enough to memorize, crunch, and store every bit of information on this planet: every book, every library, every internet database ever written or being written, of all time, including every bit of information stored in every cell of our bodies. You can't even apply the whole concept of "I.Q." to such an alien race. Considering the speed of a supercomputer, which has an SQ of 9, that probably means that, from a physics perspective, an "I.Q." of 500 is not impossible. The most offensive thing about the equation is the equal potential of every human being.
From a structural perspective, the brain is already at its "maximum potential." Here's the kicker, though: not everyone on this earth makes use of that potential. If Dr. Freitas's equation is correct, then through the use of a special training regimen (started, preferably, from childhood) it is possible to create a society where everyone is a Promethean. Hypothetically.
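For reference, the arithmetic behind the SQ 13 figure discussed above can be sketched as follows. This is a minimal sketch, assuming Freitas's formula SQ = log10(I/M), with I the information processing rate in bits/s and M the brain mass in kg; the 10^13 bits/s aggregate rate and ~1.4 kg brain mass are the figures quoted elsewhere on this talk page, not new claims:

```python
import math

def sentience_quotient(bits_per_second, mass_kg):
    """Freitas's SQ = log10(I / M): information rate per unit brain mass."""
    return math.log10(bits_per_second / mass_kg)

# Human brain, using the figures quoted in this discussion:
# an aggregate rate of ~1e13 bits/s and a mass of ~1.4 kg.
human_rate = 1e13  # bits/s
print(round(sentience_quotient(human_rate, 1.4)))  # 13
```

Note that the logarithm makes the result insensitive to modest errors in either input: a factor-of-two change in rate or mass shifts the SQ by only ~0.3.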
- Oh well, it's obvious that the concepts of intelligence, as used in psychology as an empirical science, and as perceived by non-psychologists, differ.
- But first, where's the neurobiology? The guys researching that stuff conduct fascinating research, so please be so kind as to not insult them that way.
- Second, the differences in intelligence, and thus in IQ (or as you put it: "how well one uses his SQ of 13"), between people should be separated from SQ, which, as I understand it, is about data processing efficiency.
- As intelligence, depending on the definition and particular theory, consists of more than reaction to basic stimuli and may include elaborate problem-solving strategies, comparing IQ and SQ is like comparing a measure of whole-computer performance (software running on specific hardware, thus human intelligence) with a measure of the hardware alone (brain efficiency, SQ).
- To make it more dramatic: a computer with an SQ of 6 won't do anything if switched off (like a human with a possible SQ of 13 who is very, very stupid...or sleeping, or whatever), but that wouldn't alter the SQ value of that particular machine.
- Third, your "guess" of possible IQs tells me you could benefit from reading Wikipedia's article about IQ. In short, it transforms the score achieved in an intelligence test (with a normal distribution) onto a scale with a mean of 100 and a standard deviation of 15. So if you are one standard deviation above the mean, your IQ is 115 and thus higher than the IQ of 84% of all people; 16% score higher than you. For SD=2, IQ=130, your IQ is higher than that of 98% of the population; only 2% score higher. For SD=5, IQ=175, you're in the top 0.000029%; that wouldn't even be 2000 men and women on earth. For SD=7, only 1.28e-10% are better (which is the area under the curve still left out, btw), so you're better than (100 - 1.28e-10)%. At SD=8, you're part of the best 1e-15. Well, we have 6.6e+9 human beings we might test...taking the math literally and assuming reliable tests for that purpose.
- Ok, now, an IQ of 500 equals 26 and two-thirds standard deviations above the mean, which is...um...put frankly...that won't happen.
- Reason: that's not linear and stuff, you know? Hmmm. MH —Preceding unsigned comment added by 184.108.40.206 (talk) 04:27, 11 May 2008 (UTC)
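The percentile figures in the comment above follow directly from the normal distribution and can be reproduced with Python's standard library. This is just a sketch of the same calculation (mean 100, SD 15, as described above); nothing new is assumed:

```python
from statistics import NormalDist

iq = NormalDist(mu=100, sigma=15)
for sd in (1, 2, 5, 7):
    score = 100 + 15 * sd
    tail = 1 - iq.cdf(score)  # fraction of the population scoring higher
    print(f"IQ {score} (+{sd} SD): top {tail:.3g}")
# Beyond roughly +8 SD the tail probability drops toward the limit of
# double-precision arithmetic; an "IQ of 500" (+26.7 SD) is numerically zero.
```

This prints, e.g., top ~2.87e-07 at +5 SD and top ~1.28e-12 at +7 SD, matching the figures given above.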
Measurement of Bits per Second and Instructions per Second
In a similar vein to some of the above, I think it should be noted that the capabilities of a computer system are not necessarily solely defined by the number of bits or instructions that it can process per second.
Parallelism may be a factor, as demonstrated by the use of GPGPUs for some tasks. Also, information is not necessarily implemented in binary. Analog computers use continuously varying voltage differences.
I added a link to a 1997 article by Hans Moravec that concludes 100 million MIPS, or 100 trillion (1E+14) instructions per second, based on visual processing of the human retina compared to that of computer systems with known IPS rates. Moravec's figure is one order of magnitude off from the 1E+13 stated in the "Xenopsychology" paper. Freitas does not state how he derived his first premise that "one neuron can process 1000-3000 bits/sec". The discrepancy demonstrates that there is no universally agreed measure of human brain processing speed.
Also, the bits-per-second performance of a visual display system is not necessarily indicative of the ability of the system (e.g. the brain) to perform a wider variety of tasks. It may be that more "intelligent" processing depends on other factors in addition to BPS/IPS. Very good electronic cameras can effectively replicate human vision, but computers have yet to replicate many other aspects of cognition, including those commonly associated with "sentience".
- This criticism seems a bit off the mark. "Xenopsychology" never states that the brain's processing power is 1E+13 of either bits or instructions per second (whatever the latter might mean here). Rather, since Freitas states a processing power of a bit over 1E+3 bits/s per neuron, and the human brain contains a bit under 1E+11 neurons, the implicit claim is clearly that the aggregate maximum processing power is around 1E+14 bits/s instead. I do not know how Freitas arrived at his 1E3-3E3 bits/s figure for a neuron, but it seems to coincide with upper estimates of the maximum firing rates of neurons (which most typically fire at a more sedate pace, an order of magnitude lower).
- As to the rest, Freitas also explicitly acknowledged that this is not a measure of actual intelligence, but rather of the maximum possible intelligence given a physical size (mass) of the brain. Parallelism increases intelligence by increasing the _size_ of the computational device, so of course it does not affect SQ (since it's a per-mass measure), and is therefore completely irrelevant here. Lastly, there is no claim that information has a binary implementation; modern information theory can quantify it regardless (the difference being that the measure is no longer constrained to be an integer). Stan Liou (talk) 11:50, 23 November 2009 (UTC)
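The aggregate figure in the reply above is easy to check. A sketch, using only the per-neuron rate and neuron count quoted from the Freitas article in this thread:

```python
neurons = 1e11       # "a bit under 10^11" neurons in the human brain
low, high = 1e3, 3e3 # Freitas's 1000-3000 bits/s per neuron
print(f"{neurons * low:.0e} to {neurons * high:.0e} bits/s")
# 1e+14 to 3e+14 bits/s -- the same order of magnitude as
# Moravec's 1e14 instructions-per-second estimate mentioned above.
```

So the apparent order-of-magnitude discrepancy dissolves once the per-neuron figure is aggregated over the full neuron count.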
I don't buy that there can be a maximum SQ. Why can't there be a computer that takes longer than 10^18 seconds to process a bit? I'd like to see a source on that number. Rm999 (talk) 13:31, 13 April 2008 (UTC)
- It's the minimum SQ you're referring to? Well, it was simply assumed that the time to process a bit was around 13.7 billion years, the approximate age of the universe. Slower processes could be imagined, of course, but they wouldn't fit the idea of "theoretically possible practical slowness", so to say :-)
- 13.7 billion years equals 4.3×10^17 s, so 10^18 s gives you some extra time. MH —Preceding unsigned comment added by 220.127.116.11 (talk) 03:35, 11 May 2008 (UTC)
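For what it's worth, the bound described above can be put into numbers. This is a sketch assuming the minimum-SQ construction usually attributed to the Xenopsychology article: one bit per ~10^18 s processed by a "brain" with roughly the mass of the observable universe; both the ~10^52 kg mass and the -70 endpoint are my assumptions here, not figures stated on this page:

```python
import math

seconds_per_year = 3.156e7
age_universe = 13.7e9 * seconds_per_year  # ~4.3e17 s, as computed above
rate = 1 / 1e18                           # one bit per ~10^18 s
universe_mass = 1e52                      # kg, rough order of magnitude (assumption)
print(round(math.log10(rate / universe_mass)))  # -70
```

The choice of 10^18 s rather than 4.3×10^17 s just rounds the exponent, exactly as the comment above says.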
- Whoops, yeah, I meant minimum. I still don't agree with that reasoning - for example, maybe the yes-or-no answer to the question "was the Universe worth existing?" is currently being calculated by the Universe itself, and requires another 10 trillion years to be computed. There is no hard limit to how long it can take a bit to process. Rm999 (talk) 07:36, 3 June 2008 (UTC)
Practicality and common sense would suggest that if it took longer than the age of the present universe, then it has never processed a single piece of information, and there's no way of knowing how long it will eventually take. 18.104.22.168 (talk) 12:20, 1 August 2008 (UTC)
If the mass of a neuron were 10^-10 kg, then the 100 billion neurons of the brain would weigh a total of 10 kg. The average weight of the adult brain is 1.3-1.4 kg; 80% of that is water, and not all of the rest is neurons. That gives an estimated sentience quotient of at least +15, a hundred times better than +13. —Preceding unsigned comment added by 22.214.171.124 (talk) 12:52, 8 October 2010 (UTC)
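The arithmetic in the comment above can be checked directly. A sketch, using the 10^13 bits/s aggregate rate quoted elsewhere on this page and the masses mentioned in the comment (the 0.1 kg "neurons only" value is an illustrative guess, not a figure from the comment):

```python
import math

neurons = 1e11
implied_total = neurons * 1e-10  # 10 kg if each neuron weighed 1e-10 kg
print(implied_total)             # 10.0 -- vs. a real brain of ~1.4 kg

rate = 1e13  # bits/s, the aggregate figure used in this discussion
for mass in (10, 1.4, 0.1):      # implied total, whole brain, neurons-only guess
    print(f"{mass} kg -> SQ {math.log10(rate / mass):.1f}")
```

Each tenfold reduction in the mass denominator adds exactly one SQ unit, which is the direction of the +15 claim, though reaching a full +15 from 10^13 bits/s would require a neuronal mass of only ~0.01 kg.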
I hypothesise that the total combined computational/brain/organic power of the entire multiverse was approximately 2000 SQ as of 2008. This is possible because Planck's constant doesn't apply everywhere. Don't delete this any time soon. Knowledge from beyond.
Makes no sense
Extraordinary claims require extraordinary evidence. Unless someone can cite some very reliable sources which explain why that makes sense, saying that a brain neuron can process 1000-3000 bit/s seems pretty ridiculous. AFAIK, the human brain is not a digital computer and does not run on bits at all. May as well say an Apple has 1000-3000 Orange Units, if you catch my drift. — Preceding unsigned comment added by 126.96.36.199 (talk) 21:28, 7 November 2013 (UTC)
Analysis, misconceptions and criticism
I've added this section as I wasn't sure about the original definition as published, nor did I want to mess up anything else. It's more about pointing out the problematic parts of the definition as well as the overgeneralization of SQ. Feel free to modify it and integrate it into the article body. There are generally many misconceptions and errors on this talk page, and I feel the need to address them.
As to bits and bytes, there seems to be some confusion, among other things, between information coming from sensory organs, data storage capacity, and actual processing power. Simply said, any signal is an analog sensory information bit, and any synaptic connection is a data storage bit. Note that these figures can even be found in textbooks since the '90s. Synapses are considered storage bits because signal pathways change through learning. Or consider the permutation of all signals as a single memory state of the brain in a single cycle. All sensory information is obviously processed in real time; hence, the amount of information processed is the processing power of the brain. However, this does not translate one to one into digital computer processing power. In order to process something, a computer needs to run an operating system and software. The program needs to execute an amount of operating code (instructions) per unit of raw data, and the efficiency of this code determines how much more computer power is required to match a biological brain. The actual amount of sensory information a human brain processes is far larger than usually given.
The supercomputer example above appears to be wrong, as it seems to take in the total body mass and not the brain mass (chips). Take the Haswell Core i7-4770K as an example, with a die size of 177 mm^2, an estimated weight of 0.3402 g, and 118.91 GIPS in turbo (249.6 GFLOPS DP / 499.2 GFLOPS SP). This gives a raw SQ of 14.5, an SQ of 12.7 for simple signal processing, and an SQ of 10.2 for complex signal processing. Haswell is by far not designed for signal processing, nor is it the fastest.
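The raw figure above reproduces with the same per-mass formula. A sketch, using the die mass and GIPS rating quoted in the comment; treating one instruction per second as one "bit per second" is the comment's implicit assumption, and the 12.7 and 10.2 figures depend on further instructions-per-signal assumptions that are not stated, so only the raw value is checked here:

```python
import math

die_mass_kg = 0.3402e-3  # estimated Haswell i7-4770K die weight, as quoted
ips = 118.91e9           # 118.91 GIPS in turbo
print(round(math.log10(ips / die_mass_kg), 1))  # 14.5
```

The result lands above the human +13 only because the denominator is the bare die mass; including packaging, board, cooling, and power supply would pull it down by several units.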
As for intelligence, see "Ability to Ignore Distractions Correlates With Intelligence, Study Says". Here, intelligence is the ability to filter the important information out of a huge amount of information. Now, that's a measurement of real efficiency or, as implied, real intelligence. It is obvious there's a fixed ratio of relevant to irrelevant data in a certain data set; hence, there's an asymptote of maximum efficiency regardless of brain design. Therefore, relative intelligence should be measured as quickness of perceptibility: a measurement of how well a brain understands or picks something up. Mightyname (talk) 16:39, 23 August 2014 (UTC)