IBM's Watson machine provides one example of a neural network implementation of cognitive computing and deep learning. A subsequent IBM development is the TrueNorth microchip architecture, which is designed to be closer in structure to the human brain than the von Neumann architecture used in conventional computers. In 2017, Intel announced its own cognitive chip, Loihi, which it said would be made available to university and research labs in 2018.
Intel Loihi chip
Intel's self-learning neuromorphic chip Loihi, perhaps named after the Hawaiian seamount Loihi, is modeled on the human brain and designed for high power efficiency. Intel claims Loihi is about 1,000 times more energy efficient than the general-purpose computing hardware needed to train neural networks of comparable performance. In principle, this would support both machine learning training and inference on the same silicon, independently of a cloud connection and more efficiently than conventional convolutional neural networks (CNNs) or deep learning networks. As an example, Intel describes a system that monitors a person's heartbeat, takes readings after events such as exercise or eating, and uses the cognitive computing chip to normalize the data and learn what a 'normal' heartbeat looks like. The system can then spot abnormalities, and can also adapt to new events or conditions.
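The baseline-and-anomaly behaviour described above can be sketched in ordinary code. The window of readings, the threshold, and the function name here are illustrative assumptions, not details of Intel's actual application:

```python
# Minimal sketch of learning a 'normal' baseline from readings and
# flagging deviations. Illustrative only; not Intel's implementation.
from statistics import mean, stdev

def is_abnormal(history, reading, k=3.0):
    """Flag a heart-rate reading more than k standard deviations
    from the baseline learned over past readings."""
    baseline, spread = mean(history), stdev(history)
    return abs(reading - baseline) > k * spread

# Resting readings establish the baseline; a spike stands out.
resting = [62, 64, 63, 65, 61, 63, 64, 62]
print(is_abnormal(resting, 64))   # within normal range -> False
print(is_abnormal(resting, 120))  # large deviation -> True
```

A real system would update the baseline continuously as new conditions (exercise, eating) are observed, rather than using a fixed history.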
The first iteration of the Loihi chip was made using Intel's 14 nm fabrication process and houses 128 clusters of 1,024 artificial neurons each, for a total of 131,072 simulated neurons. This offers around 130 million synapses, still far short of the human brain's estimated 800 trillion synapses, and behind IBM's TrueNorth, whose 64-chip configuration of 4,096 cores per chip provides around 16 billion.
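The neuron count above follows directly from the cluster arithmetic; the synapses-per-neuron figure below is a derived estimate, not a published Intel specification:

```python
# Sanity-checking the figures quoted above.
neurons = 128 * 1024        # 128 clusters of 1,024 neurons each
print(neurons)              # 131072 simulated neurons

synapses = 130_000_000      # Intel's quoted ~130 million synapses
# Derived estimate (our calculation): roughly 1,000 synapses per neuron.
print(round(synapses / neurons))  # 992
```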
IBM TrueNorth chip
The IBM cognitive computers implement learning using Hebbian theory. Rather than being programmed in the traditional sense, in machine language or a higher-level programming language, such a device learns from instances presented through an input device, which are aggregated within a computational convolution or neural network architecture of weights held in a parallel memory system. An early instantiation of such a device was developed in 2012 under the DARPA SyNAPSE program at IBM, directed by Dharmendra Modha.
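Hebbian theory holds that connections between neurons that are active together become stronger. The weight update below is a minimal illustrative sketch of that rule, not IBM's actual TrueNorth implementation:

```python
# Illustrative Hebbian weight update ("neurons that fire together
# wire together"). Not IBM's implementation; a textbook sketch.

def hebbian_update(weights, pre, post, lr=0.1):
    """Strengthen weights[i][j] when pre-synaptic neuron i and
    post-synaptic neuron j are active at the same time."""
    return [
        [w + lr * pre[i] * post[j] for j, w in enumerate(row)]
        for i, row in enumerate(weights)
    ]

# Repeated presentations of a correlated pattern reinforce only the
# connection between the co-active pair of neurons.
w = [[0.0, 0.0], [0.0, 0.0]]
for _ in range(2):
    w = hebbian_update(w, pre=[1, 0], post=[1, 0])
print(w[0][0])  # grows with each co-activation: 0.2
print(w[0][1])  # untouched, since neuron 1 was never active: 0.0
```

In hardware such as TrueNorth, updates like this are applied across weights stored in a parallel memory system rather than in a Python loop.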
As of 2017, this IBM 64-chip array contains the processing equivalent of 64 million neurons and 16 billion synapses, yet each processor consumes just 10 watts of electricity. Like other neural networks, the system is intended for pattern recognition and sensory processing roles. The Air Force wants to combine TrueNorth's ability to convert multiple data feeds, whether audio, video, or text, into machine-readable symbols with a conventional supercomputer's ability to crunch data. This is not the first time IBM's neural chip system has been integrated into cutting-edge technology: in August 2017, Samsung installed the chips in its Dynamic Vision Sensors, enabling cameras to capture images at up to 2,000 fps while using just 300 milliwatts of power.