Quantum neural network
Quantum neural networks (QNNs) are neural network models which are based on the principles of quantum mechanics. There are two different approaches to QNN research, one exploiting quantum information processing to improve existing neural network models (sometimes also vice versa), and the other one searching for potential quantum effects in the brain.
- 1 Artificial quantum neural networks
- 2 Biological quantum neural networks
- 3 See also
- 4 References
- 5 External links
Artificial quantum neural networks
In the computational approach to quantum neural network research, scientists try to combine artificial neural network models (which are widely used in machine learning for the important task of pattern classification) with the advantages of quantum information in order to develop more efficient algorithms (for a review, see ). One important motivation for these investigations is the difficulty of training classical neural networks, especially in big data applications. The hope is that features of quantum computing such as quantum parallelism or the effects of interference and entanglement can be used as resources. Since the technological implementation of a quantum computer is still at an early stage, such quantum neural network models are mostly theoretical proposals that await full implementation in physical experiments.
Quantum neural network research is still in its infancy, and a conglomeration of proposals and ideas of varying scope and mathematical rigor has been put forward. Most of them are based on the idea of replacing the classical binary or McCulloch-Pitts neuron with a qubit (which can be called a “quron”), resulting in neural units that can be in a superposition of the states ‘firing’ and ‘resting’.
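The quron idea can be illustrated with a minimal numpy sketch (an illustration of the general concept, not any specific proposal from the literature): the classical firing/resting bit becomes a normalised two-component amplitude vector, and measurement probabilities follow the Born rule.

```python
import numpy as np

# A classical McCulloch-Pitts neuron is either 'resting' (0) or 'firing' (1).
# A "quron" replaces this bit with a qubit: a normalised amplitude vector
# alpha*|resting> + beta*|firing> with |alpha|^2 + |beta|^2 = 1.
resting = np.array([1.0, 0.0])   # |0>
firing  = np.array([0.0, 1.0])   # |1>

# Equal superposition of 'resting' and 'firing'
quron = (resting + firing) / np.sqrt(2)

# Born rule: measurement probabilities are squared amplitude magnitudes
p_resting = abs(quron[0]) ** 2
p_firing  = abs(quron[1]) ** 2
print(p_resting, p_firing)  # 0.5 each
```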
The first ideas on quantum neural computation were published independently in 1995 by Ron Chrisley and Subhash Kak. Kak discussed the similarity of the neural activation function with the quantum mechanical eigenvalue equation, and later discussed the application of these ideas to the study of brain function and the limitations of this approach. Ajit Narayanan and Tammy Menneer proposed a photonic implementation of a quantum neural network model that is based on the many-universe theory and “collapses” into the desired model upon measurement. Since then, more and more articles have been published in journals of computer science as well as quantum physics in order to find a superior quantum neural network model.
Many proposals attempt to find a quantum equivalent for the perceptron unit from which neural nets are constructed. A problem is that nonlinear activation functions do not immediately correspond to the mathematical structure of quantum theory, since a quantum evolution is described by linear operations and leads to probabilistic observation. Ideas to imitate the perceptron activation function with a quantum mechanical formalism range from special measurements to postulating non-linear quantum operators (a mathematical framework that is disputed). A direct implementation of the activation function using the circuit-based model of quantum computation has recently been proposed by Schuld, Sinayskiy and Petruccione based on the quantum phase estimation algorithm.
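The core tension can be made concrete with a small numpy check (a generic illustration, not drawn from any of the cited proposals): a unitary gate satisfies the linearity identity exactly, while a perceptron's step activation violates it.

```python
import numpy as np

# Quantum evolution is linear: U(a|x> + b|y>) = a*U|x> + b*U|y>.
# A perceptron's step activation is not, so it has no direct unitary analogue.
U = np.array([[1, 1], [1, -1]]) / np.sqrt(2)  # Hadamard gate, a typical unitary

x = np.array([1.0, 0.0])
y = np.array([0.0, 1.0])
a, b = 0.6, 0.8  # amplitudes with a^2 + b^2 = 1

# The unitary respects linearity
lhs = U @ (a * x + b * y)
rhs = a * (U @ x) + b * (U @ y)
print(np.allclose(lhs, rhs))  # True

# The step activation violates the same identity
step = lambda v: (v > 0).astype(float)
print(np.allclose(step(a * x + b * y), a * step(x) + b * step(y)))  # False
```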
A substantial amount of interest has been given to a “quantum-inspired” model that uses ideas from quantum theory to implement a neural network based on fuzzy logic.
Some contributions reverse the approach and try to exploit the insights from neural network research in order to obtain powerful applications for quantum computing, such as quantum algorithmic design supported by machine learning. An example is the work of Elizabeth Behrman and Jim Steck, who propose a quantum computing setup that consists of a number of qubits with tunable mutual interactions. Following the classical backpropagation rule, the strengths of the interactions are learned from a training set of desired input-output relations, and the quantum network thus ‘learns’ an algorithm.
Quantum associative memory
The quantum associative memory algorithm was introduced by Dan Ventura and Tony Martinez in 1999. The authors do not attempt to translate the structure of artificial neural network models into quantum theory, but propose an algorithm for a circuit-based quantum computer that simulates associative memory. The memory states (in Hopfield neural networks saved in the weights of the neural connections) are written into a superposition, and a Grover-like quantum search algorithm retrieves the memory state closest to a given input. An advantage lies in the exponential storage capacity of memory states; however, the question remains whether the model has significance regarding the initial purpose of Hopfield models as a demonstration of how simplified artificial neural networks can simulate features of the brain.
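The Grover-style retrieval step can be sketched with a textbook simulation in numpy (this shows plain Grover amplification of one marked basis state, not the full Ventura-Martinez construction; the register size and target index are arbitrary choices for illustration):

```python
import numpy as np

N = 16           # search space of a 4-qubit memory register
target = 5       # index of the stored memory closest to the query (assumed)

# Start from the uniform superposition over all basis states
state = np.full(N, 1 / np.sqrt(N))

# Grover iteration: oracle phase flip on the target, then inversion about the mean.
# The optimal iteration count is about (pi/4) * sqrt(N).
for _ in range(int(round(np.pi / 4 * np.sqrt(N)))):
    state[target] *= -1               # oracle marks the target
    state = 2 * state.mean() - state  # diffusion (inversion about the mean)

# Measurement now returns the marked memory with probability close to 1
print(np.argmax(state ** 2))  # 5
```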
Quantum computing via Sparse Distributed Representations
Rinkus proposes that distributed representation, specifically sparse distributed representation (SDR), provides a classical implementation of quantum computing. Specifically, the set of SDR codes stored in an SDR coding field will generally intersect with each other to varying degrees. In other work, Rinkus describes a fixed-time learning (and inference) algorithm that maps similarity in the input space to similarity (intersection size) in the SDR code space. Assuming that input similarity correlates with probability, any single active SDR code is also a probability distribution over all stored inputs, with the probability of each input measured by the fraction of its SDR code that is active (i.e., the size of its intersection with the active SDR code). The learning/inference algorithm can also be viewed as a state update operator: because any single active SDR simultaneously represents both the probability of the single input, X, to which it was assigned during learning and the probabilities of all other stored inputs, the same physical process that updates the probability of X also updates all stored probabilities. By 'fixed time', it is meant that the number of computational steps comprising this process (the update algorithm) remains constant as the number of stored codes increases. This theory departs radically from the standard view of quantum computing and from quantum physical theory more generally: rather than assuming that the states of the lowest-level entities in the system (single binary neurons) exist in superposition, it assumes only that higher-level, composite entities (whole SDR codes, which are sets of binary neurons) exist in superposition.
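The intersection-as-probability idea can be sketched as follows (the coding field, the stored codes, and the input labels are hypothetical examples, not taken from Rinkus's model):

```python
import numpy as np

# Hypothetical SDR coding field of 8 binary units; each stored input is
# assigned a sparse binary code (a set of active units). Codes overlap.
stored = {
    "A": np.array([1, 1, 1, 0, 0, 0, 0, 0]),
    "B": np.array([1, 1, 0, 1, 0, 0, 0, 0]),  # shares 2 of 3 units with A
    "C": np.array([0, 0, 0, 0, 1, 1, 1, 0]),  # disjoint from A
}

# Suppose the currently active code is the one assigned to input A.
active = stored["A"]

# Each stored input's "probability" is the fraction of its code that is
# currently active, i.e. its normalised intersection with the active code.
overlap = {k: (v & active).sum() / v.sum() for k, v in stored.items()}
print(overlap)  # A: 1.0, B: ~0.67, C: 0.0
```

Note that activating A's code automatically assigns graded values to every other stored input in one pass, which is the sense in which a single physical state carries a distribution over all stored inputs.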
Most learning algorithms follow the classical model of training an artificial neural network to learn the input-output function of a given training set, using a classical feedback loop to update the parameters of the quantum system until they converge to an optimal configuration. Learning as a parameter optimisation problem has also been approached by adiabatic models of quantum computing. Recently, a new post-learning strategy has been proposed that searches for an improved set of weights based on an analogy with quantum effects occurring in nature. The technique, proposed in , is based on the analogy of modeling a biological neuron as a semiconductor heterostructure consisting of one energetic barrier sandwiched between two energetically lower areas. The activation function of the neuron is then treated as a particle entering the heterostructure and interacting with the barrier. In this way, auxiliary reinforcement of the classical learning process of neural networks is achieved with minimal additional computational cost.
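The physics behind the barrier analogy is the standard quantum mechanical transmission probability for a rectangular barrier; a minimal sketch, in units with hbar = m = 1 (the textbook formula only, with no claim about how the cited post-learning strategy couples it to the weights):

```python
import numpy as np

def transmission(E, V0, a):
    """Transmission probability for a particle of energy E hitting a
    rectangular barrier of height V0 and width a (hbar = m = 1)."""
    if E > V0:  # above-barrier case: oscillatory branch
        k = np.sqrt(2 * (E - V0))
        return 1 / (1 + V0**2 * np.sin(k * a)**2 / (4 * E * (E - V0)))
    # tunnelling case E < V0: evanescent branch
    kappa = np.sqrt(2 * (V0 - E))
    return 1 / (1 + V0**2 * np.sinh(kappa * a)**2 / (4 * E * (V0 - E)))

# Tunnelling probability shrinks rapidly as the barrier widens
print(transmission(0.5, 1.0, 1.0))
print(transmission(0.5, 1.0, 3.0))
```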
One way of constructing a quantum neuron is to first generalise classical neurons (by padding with ancillary bits) to reversible permutation gates and then generalise them further to unitary gates. Due to the no-cloning theorem in quantum mechanics, copying the output before sending it to several neurons in the next layer is non-trivial. Copying can be replaced with a general quantum unitary acting on the output plus a dummy bit in state |0⟩. This has the classical copying gate (CNOT) as a special case, and in that sense generalises the classical copying operation. It can be demonstrated that in this scheme, quantum neural networks can: (i) compress quantum states onto a minimal number of qubits, creating a quantum autoencoder, and (ii) discover quantum communication protocols such as teleportation. The general recipe is theoretical and implementation-independent. The quantum neuron module can naturally be implemented photonically.
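Why the copying step is the delicate point can be checked directly in numpy: CNOT with a |0⟩ ancilla duplicates basis states, but applied to a superposition it produces an entangled state rather than two copies, exactly as the no-cloning theorem demands.

```python
import numpy as np

# CNOT on two qubits: flips the second (target) qubit when the first is |1>
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=float)

zero = np.array([1.0, 0.0])
one  = np.array([0.0, 1.0])

# Basis states are copied: CNOT(|1> (x) |0>) = |1> (x) |1>
out = CNOT @ np.kron(one, zero)
print(np.allclose(out, np.kron(one, one)))  # True

# A superposition is NOT copied: CNOT(|+> (x) |0>) yields the entangled
# state (|00> + |11>)/sqrt(2), not |+> (x) |+>
plus = (zero + one) / np.sqrt(2)
out = CNOT @ np.kron(plus, zero)
print(np.allclose(out, np.kron(plus, plus)))  # False
```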
Biological quantum neural networks
Although many quantum neural network researchers explicitly limit their scope to a computational perspective, the field is closely connected to investigations of potential quantum effects in biological neural networks. Models of cognitive agents and memory based on quantum collectives have been proposed by Subhash Kak, but he also points to specific problems of limits on observation and control of these memories due to fundamental logical reasons. The combination of quantum physics and neuroscience also nourishes a vivid debate beyond the borders of science, an illustrative example being journals such as NeuroQuantology or the healing method of Quantum Neurology. However, also in the scientific sphere, theories of how the brain might harvest the behavior of particles on a quantum level are controversially debated. The fusion of biology and quantum physics recently gained momentum through the discovery of signs of efficient energy transport in photosynthesis due to quantum effects. However, there is no widely accepted evidence for the ‘quantum brain’ yet.
- da Silva, Adenilton J.; Ludermir, Teresa B.; de Oliveira, Wilson R. "Quantum perceptron over a field and neural network architecture selection in a quantum computer". Neural Networks. 76: 55–64. doi:10.1016/j.neunet.2016.01.002.
- Panella, Massimo; Martinelli, Giuseppe. "Neural networks with quantum architecture and quantum learning". International Journal of Circuit Theory and Applications. 39: 61–77. doi:10.1002/cta.619.
- M. Schuld, I. Sinayskiy, F. Petruccione: The quest for a Quantum Neural Network, Quantum Information Processing 13, 11, pp. 2567-2586 (2014)
- R. Chrisley, Quantum Learning, In New directions in cognitive science: Proceedings of the international symposium, Saariselka, 4-9 August 1995, Lapland, Finland, P. Pylkkänen and P. Pylkkö (editors). Finnish Association of Artificial Intelligence, Helsinki, 77-89 (1995)
- S. Kak, On quantum neural computing, Advances in Imaging and Electron Physics 94, 259 (1995)
- S. Kak, The three languages of the brain: quantum, reorganizational, and associative. In Learning as Self-Organization, K. Pribram and J. King (editors). Lawrence Erlbaum Associates, Mahwah, NJ, 185-219 (1996)
- A. Gautam and S. Kak, Symbols, meaning, and origins of mind. Biosemiotics (Springer Verlag) 6: 301-310 (2013)
- A. Narayanan and T. Menneer: Quantum artificial neural network architectures and components, Information Sciences 128, 231-255 (2000)
- M. Perus: Neural Networks as a basis for quantum associative memory, Neural Network World 10 (6), 1001 (2000)
- M. Zak, C.P. Williams: Quantum Neural Nets, International Journal of Theoretical Physics 37(2), 651 (1998)
- S. Gupta, R. Zia: Quantum Neural Networks, Journal of Computer and System Sciences 63(3), 355 (2001)
- J. Faber, G.A. Giraldi: Quantum Models for Artificial Neural Network (2002), electronically available: http://arquivosweb.lncc.br/pdfs/QNN-Review.pdf
- M. Schuld, I. Sinayskiy, F. Petruccione: Simulating a perceptron on a quantum computer ArXiv:1412.3635 (2014)
- G. Purushothaman, N. Karayiannis: Quantum Neural Networks (QNN’s): Inherently Fuzzy Feedforward Neural Networks, IEEE Transactions on Neural Networks, 8(3), 679 (1997)
- J. Bang et al. : A strategy for quantum algorithm design assisted by machine learning, New Journal of Physics 16 073017 (2014)
- E.C. Behrman, J.E. Steck, P. Kumar, K.A. Walsh: Quantum Algorithm design using dynamic learning, Quantum Information and Computation, vol. 8, No. 1&2, pp. 12-29 (2008)
- D. Ventura, T. Martinez: A quantum associative memory based on Grover's algorithm, Proceedings of the International Conference on Artificial Neural Networks and Genetic Algorithms, pp. 22-27 (1999)
- G. Rinkus (2012): Quantum Computation via Sparse Distributed Representation. NeuroQuantology 10(2), 311-315
- G. Rinkus (1996): A Combinatorial Neural Network Exhibiting Episodic and Semantic Memory Properties for Spatio-Temporal Patterns. Doctoral Thesis. Boston University. Boston, MA.
- G. Rinkus (2010): A cortical sparse distributed coding model linking mini- and macrocolumn-scale functionality. Frontiers in Neuroanatomy 4:17
- H. Neven et al.: Training a Binary Classifier with the Quantum Adiabatic Algorithm, arXiv:0811.0416v1 (2008)
- Kapanova, K. G., I. Dimov, and J. M. Sellier. "On randomization of neural networks as a form of post-learning strategy." arXiv preprint arXiv:1511.08366 (2015).
- Wan, Kwok-Ho; Dahlsten, Oscar; Kristjansson, Hler; Gardner, Robert; Kim, Myungshik (2016). "Quantum generalisation of feedforward neural networks". Bibcode:2016arXiv161201045W. arXiv:1612.01045.
- W. Loewenstein: Physics in mind. A quantum view of the brain, Basic Books (2013)
- H. Stapp: Mind Matter and Quantum Mechanics, Springer, Heidelberg (2009)
- S. Kak, Biological memories and agents as quantum collectives. NeuroQuantology 11: 391-398 (2013)
- S. Kak, Observability and computability in physics, Quantum Matter 3: 172-176 (2014)
- S. Hameroff: Quantum computation in brain microtubules? The Penrose-Hameroff 'Orch-OR' model of consciousness, Philosophical Transactions of the Royal Society of London, Series A, 356(1743), 1869 (1998)
- E. Pessa, G. Vitiello: Bioelectrochemistry and Bioenergetics, 48(2), 339 (1999)
- Neukart, Florian (2013). "On Quantum Computers and Artificial Neural Networks". Signal Processing Research. 2 (1).
- Neukart, Florian (2014). "Operations on Quantum Physical Artificial Neural Structures". Procedia Engineering. 2 (1): 1509–1517. doi:10.1016/j.proeng.2014.03.148.