Cognitive computer

A cognitive computer combines artificial intelligence and machine-learning algorithms in an approach that attempts to reproduce the behaviour of the human brain.[1]

An example of a neural-network implementation of cognitive convolution and deep learning is IBM's Watson machine. A subsequent development by IBM is the TrueNorth microchip architecture, which is designed to be closer in structure to the human brain than the von Neumann architecture used in conventional computers.[1] In 2017 Intel announced its own cognitive chip, Loihi, which it intends to make available to university and research labs in 2018.

Intel Loihi chip

Intel's self-learning neuromorphic chip, named Loihi (perhaps after the Hawaiian seamount Loihi), is modelled on the human brain and offers substantial power efficiency. Intel claims Loihi is about 1,000 times more energy efficient than the general-purpose computing power needed to train the neural networks that rival Loihi's performance. In theory, this would support both machine-learning training and inference on the same silicon, independently of a cloud connection, and more efficiently than using convolutional neural networks (CNNs) or deep learning neural networks. Intel points to a system for monitoring a person's heartbeat that takes readings after events such as exercise or eating and uses the cognitive computing chip to normalize the data and work out the 'normal' heartbeat. It can then spot abnormalities and also adapt to new events or conditions.
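As an illustration only, the heartbeat example amounts to learning a per-context baseline and flagging deviations from it. The sketch below does this with ordinary NumPy statistics on a conventional CPU; the class name, the contexts and the 3-sigma threshold are assumptions made for illustration, not Intel's on-chip learning rule.

    import numpy as np

    class HeartbeatMonitor:
        """Toy baseline-and-outlier detector; not Intel's Loihi learning rule."""

        def __init__(self, threshold_sigmas=3.0):
            self.threshold = threshold_sigmas
            self.baselines = {}   # context -> (mean, std), e.g. "resting", "exercise"

        def learn(self, context, readings):
            """Aggregate readings taken in one context and store its baseline."""
            readings = np.asarray(readings, dtype=float)
            self.baselines[context] = (readings.mean(), readings.std())

        def is_abnormal(self, context, reading):
            """Normalize a new reading against its context and flag large deviations."""
            if context not in self.baselines:
                return False   # unseen context: a new condition to learn, not an alarm
            mean, std = self.baselines[context]
            z = abs(reading - mean) / max(std, 1e-9)
            return z > self.threshold

    monitor = HeartbeatMonitor()
    monitor.learn("resting", [62, 65, 60, 63, 61])
    monitor.learn("exercise", [120, 130, 125, 128, 122])
    print(monitor.is_abnormal("resting", 64))    # False: within the learned baseline
    print(monitor.is_abnormal("resting", 110))   # True: abnormal for a resting reading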

The first iteration of the Loihi chip was made using Intel's 14 nm fabrication process and houses 128 clusters of 1,024 artificial neurons each, for a total of 131,072 simulated neurons.[2] This offers around 130 million synapses, still far short of the human brain's 800 trillion synapses, and behind IBM's TrueNorth, which reaches around 16 billion synapses using 64 chips of 4,096 cores each.[3]
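The neuron total is simply the cluster arithmetic; a quick check of the figures above (the ratio to the human brain is derived from the article's own numbers):

    # Quick check of the Loihi figures cited above.
    clusters = 128
    neurons_per_cluster = 1_024
    print(clusters * neurons_per_cluster)    # 131072 simulated neurons

    loihi_synapses = 130e6     # ~130 million synapses (first-generation Loihi)
    brain_synapses = 800e12    # ~800 trillion synapses (figure used above)
    print(brain_synapses / loihi_synapses)   # ~6.2 million times fewer synapses than the brain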

IBM TrueNorth chip

The IBM cognitive computers implement learning using Hebbian theory. Instead of being programmed in the traditional sense, in machine language or a higher-level programming language, such a device learns by being presented with instances through an input device; these are aggregated within a computational convolution or neural network architecture whose weights are held in a parallel memory system. An early instantiation of such a device was developed in 2012 under the DARPA SyNAPSE program at IBM, directed by Dharmendra Modha.[citation needed]
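For readers unfamiliar with Hebbian learning ("cells that fire together wire together"), the sketch below shows the textbook weight update in NumPy: weights strengthen when pre- and post-synaptic activity coincide, so the network adapts to presented instances rather than being explicitly programmed. The network size and learning rate are illustrative assumptions; this is not IBM's actual on-chip mechanism.

    import numpy as np

    rng = np.random.default_rng(0)
    n_inputs, n_outputs = 8, 4
    learning_rate = 0.1

    # Small random initial weights stand in for the chip's parallel weight memory.
    weights = rng.normal(0.0, 0.1, size=(n_outputs, n_inputs))

    def hebbian_update(weights, instance, lr=learning_rate):
        """Strengthen connections whose pre- and post-synaptic units are co-active."""
        pre = np.asarray(instance, dtype=float)      # pre-synaptic activity
        post = weights @ pre                         # post-synaptic activity
        return weights + lr * np.outer(post, pre)    # delta_w = lr * outer(post, pre)

    # Each presented instance (a binary input pattern here) nudges the weights;
    # input features that tend to be active together end up more strongly coupled.
    for _ in range(20):
        weights = hebbian_update(weights, rng.integers(0, 2, size=n_inputs))

    print(weights.round(2))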

In 2017 IBM's 64-chip TrueNorth array contained the processing equivalent of 64 million neurons and 16 billion synapses while consuming very little energy: each processor draws just 10 watts of electricity. Like other neural networks, the system will be put to use in pattern recognition and sensory processing roles. The Air Force wants to combine TrueNorth's ability to convert multiple data feeds, whether audio, video or text, into machine-readable symbols with a conventional supercomputer's ability to crunch data.[citation needed] This is not the first time IBM's neural chip system has been integrated into cutting-edge technology: in August 2017 Samsung installed the chips in its Dynamic Vision Sensors, enabling cameras to capture images at up to 2,000 fps while using just 300 milliwatts of power.[citation needed]
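Taking the figures in this section at face value (they are not official per-chip specifications), the per-chip averages and the array's total power draw follow from simple arithmetic:

    # Back-of-the-envelope arithmetic using the 64-chip array figures above.
    chips = 64
    total_neurons = 64_000_000        # 64 million neurons across the array
    total_synapses = 16_000_000_000   # 16 billion synapses across the array
    watts_per_chip = 10               # per-processor power cited above

    print(total_neurons // chips)     # 1,000,000 neurons per chip
    print(total_synapses // chips)    # 250,000,000 synapses per chip
    print(chips * watts_per_chip)     # 640 watts for the entire array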

Criticism

There are many competing approaches to, and definitions of, a cognitive computer,[4] and some argue that other approaches may be more fruitful.[5]

References

  1. ^ a b Dharmendra Modha (interview), "A computer that thinks", New Scientist, 8 November 2014, pp. 28-29.
  2. ^ "Why Intel built a neuromorphic chip", ZDNet, September 29, 2017.
  3. ^ "Intel unveils Loihi neuromorphic chip, chases IBM in artificial brains", AITrends, October 17, 2017.
  4. ^ Schank, Roger C.; Childers, Peter G. (1984). The Cognitive Computer: On Language, Learning, and Artificial Intelligence. Addison-Wesley. ISBN 9780201064438.
  5. ^ Wilson, Stephen (1988). "The Cognitive Computer: On Language, Learning, and Artificial Intelligence by Roger C. Schank, Peter Childers (review)". Leonardo. 21 (2): 210. ISSN 1530-9282. Retrieved 13 January 2017.

Links

http://www.foxnews.com/tech/2018/01/09/ces-2018-intel-gives-glimpse-into-mind-blowing-future-computing.html