Zero instruction set computer

From Wikipedia, the free encyclopedia

In computer science, zero instruction set computer (ZISC) refers to a computer architecture based solely on pattern matching, with no instructions (or microinstructions) in the classical sense. ZISC chips are often compared to artificial neural networks and are marketed by the number of "synapses" and "neurons" they provide.[1]

The acronym ZISC alludes to reduced instruction set computer (RISC).[citation needed]

History

ZISC is a technology based on ideas from artificial neural networks and massively parallel processing. The concept was invented by Guy Paillet.[2]

Design

The ZISC architecture alleviates the memory bottleneck by combining pattern memory with pattern learning and recognition logic: each neuron stores a reference pattern alongside the circuitry that compares it with the input, so patterns never have to be moved to a central processor.

Its massively parallel design addresses the winner-takes-all problem in action selection by giving each "neuron" its own memory and letting all neurons evaluate an input simultaneously; the neurons then settle among themselves which one matches best.[2]
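The behavior described above can be illustrated with a minimal software sketch. This is not the actual ZISC microarchitecture: the class and method names are invented for illustration, and the use of Manhattan (L1) distance between stored prototypes is an assumption about how such chips score matches. On real hardware every neuron computes its distance in parallel and a winner-takes-all network selects the best response; here the parallel step is simulated with a sequential `min`.

```python
def manhattan(a, b):
    """L1 distance between two equal-length numeric vectors."""
    return sum(abs(x - y) for x, y in zip(a, b))

class Neuron:
    """One 'neuron': a stored pattern plus its own comparison logic."""
    def __init__(self, pattern, category):
        self.pattern = pattern      # local pattern memory
        self.category = category    # label emitted when this neuron wins

    def distance(self, vector):
        return manhattan(self.pattern, vector)

class ZiscSketch:
    """Hypothetical model of a ZISC-style recognizer (illustrative only)."""
    def __init__(self):
        self.neurons = []

    def learn(self, pattern, category):
        # "Learning" is committing a prototype to a free neuron,
        # not adjusting weights as in a trained neural network.
        self.neurons.append(Neuron(pattern, category))

    def classify(self, vector):
        # Winner-takes-all: every neuron scores the input; the neuron
        # with the smallest distance wins and reports its category.
        winner = min(self.neurons, key=lambda n: n.distance(vector))
        return winner.category, winner.distance(vector)

chip = ZiscSketch()
chip.learn([10, 10, 10], "A")
chip.learn([200, 200, 200], "B")
print(chip.classify([12, 9, 11]))   # closest stored prototype is "A"
```

Because recognition is a fixed compare-and-select operation rather than a fetched instruction stream, latency is bounded by one parallel distance computation regardless of how many patterns are stored.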

Commercial production

IBM released the first ZISC chip, the ZISC036, with 36 neurons in 1993; the ZISC78, with 78 neurons, followed in 2000. Manufacturing was discontinued in 2001.

Around the time of IBM's patent, DARPA and Intel co-developed a comparable chip, the Ni1000.[3]

In August 2007, CogniMem Ltd. introduced the CM1K (CogniMem, 1,024 neurons), designed by Anne Menendez and Guy Paillet.

Practical uses of ZISC focus on pattern recognition, information retrieval (data mining), security, and similar tasks.

Applications and controversy

According to TechCrunch, software emulations of these types of chips are used for image recognition by many large technology companies, including Facebook and Google. When applied to other pattern-detection tasks, such as text matching, even chips released in 2007 are said to return results in microseconds.[1]

Junko Yoshida of EE Times compared the NeuroMem chip to "The Machine", a fictional system in the television series Person of Interest that predicts crimes by scanning people's faces, describing the chip as "the heart of big data" and as "foreshadow[ing] a real-life escalation in the era of massive data collection".[4]

References