
Talk:Quantum computing

From Wikipedia, the free encyclopedia

This is an old revision of this page, as edited by Joepnl (talk | contribs) at 00:10, 13 June 2018 (→I don't understand this article at all: new section). The present address (URL) is a permanent link to this revision, which may differ significantly from the current revision.

Quantum computing is a former featured article. Please see the links under Article milestones below for its original nomination page (for older articles, check the nomination archive) and why it was removed.
Article milestones
Date | Process | Result
January 19, 2004 | Refreshing brilliant prose | Kept
May 9, 2006 | Featured article review | Kept
May 13, 2007 | Featured article review | Demoted
Current status: Former featured article

Contradictory statements regarding space?

The introduction suggests that simulating an n-bit quantum computer on a classical TM requires 2^n discrete states. But doesn't this contradict the later statement that BQP is a subset of PSPACE? (Erniecohen (talk) 15:50, 14 October 2012 (UTC))[reply]

It's just an example of how a classical computer could simulate a quantum computer. It doesn't mean that it's the only way to do so. --Robin (talk) 15:55, 14 October 2012 (UTC)[reply]
The problem is that the way that it is written, it strongly implies that quantum computing provides a space advantage, which is just false, so the 2^500 crap should just be removed. The relevant connection between the models is just that a classical TM can simulate a quantum computer with a polynomial blowup in space, but is strongly believed to require an exponential blowup in time. — Preceding unsigned comment added by Erniecohen (talkcontribs) 16:04, 14 October 2012 (UTC)[reply]
I wasn't aware of this. Do you have a reference? Skippydo (talk) 19:27, 14 October 2012 (UTC)[reply]

Has this been resolved? It seems odd to me in any case to say that 2^500 complex values is equivalent to 2^501 bits (does this mean that there are only four complex values available)? W. P. Uzer (talk) 12:27, 2 February 2014 (UTC)[reply]

Given these two objections, I've commented out the sentence in question for the moment. W. P. Uzer (talk) 08:50, 4 February 2014 (UTC)[reply]
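A belated note for readers of this thread: the point can be made concrete. A brute-force simulation stores one complex amplitude per basis state, i.e. 2^n of them, but that is a property of that particular method, not a space lower bound; BQP ⊆ PSPACE holds because amplitudes can be recomputed on the fly instead of stored. A minimal, illustrative-only sketch in Python/NumPy of what the naive method costs:

 import numpy as np
 
 # Naive statevector simulation: one complex amplitude per basis state.
 n = 3
 state = np.zeros(2**n, dtype=complex)
 state[0] = 1.0                               # start in |000>
 
 # Apply a Hadamard to the first qubit by building H (x) I (x) I explicitly.
 H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
 op = H
 for _ in range(n - 1):
     op = np.kron(op, np.eye(2))
 state = op @ state
 
 print(state.size, state.nbytes)              # 8 amplitudes, 128 bytes; both double with each added qubit

The 2^500 figure in the old lede described the memory of exactly this kind of brute-force simulation; a polynomial-space simulation instead sums over computational paths and reuses its workspace, which is why no space advantage for quantum computers follows from it.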

How does it work... really!

It seems like this article would benefit from explaining how quantum computers work physically. I assume they manipulate individual atoms, but how?

It seems to me that nobody really knows how it (sort of) works. It seems the researchers in the lab are playing with atomic-scale phenomena that they don't fully understand. Since these researchers were trained to believe in quantum theory as the undisputed truth, they preach their religion as fact. In the end they create mechanisms that can do some sort of awkward computing, and they call these mechanisms "Quantum Computers". — Preceding unsigned comment added by 31.168.26.225 (talk) 15:39, 15 June 2016 (UTC)[reply]

Comment so that archiving will work. (Answer already in article.) — Arthur Rubin (talk) 12:28, 31 May 2015 (UTC)[reply]
and where is the critique section of this article? — Preceding unsigned comment added by 75.163.218.83 (talk) 05:12, 4 September 2015 (UTC)[reply]
I second that nomination. Do you know that so-called supercomputers are idle most of the time? Having more states does not equate to efficiency, and superposition collapse implies even less output, fewer states. Juror1 (talk) 17:59, 30 November 2017 (UTC)[reply]
Agreed. In the overview we find the statement: "Quantum computers share theoretical similarities with non-deterministic and probabilistic computers." Wow. Do you seriously suppose that the normal Wikipedia reader has any clue about determinism or probabilistic systems? Why even write something like that at the top of an article? Anyone who already knows about determinism, etc., also (probably) has some knowledge about quantum systems. The author of that statement could not do any better at pushing casual readers away from this topic. If anything, we need to draw more people into Computer Science in general, and bleeding-edge research in particular. Bloviation like the statement I referenced does much more harm than good by immediately throwing out arcane, industry-specific terminology. It literally turns readers off. I'm a computer scientist / engineer by education and trade, and it turned me off Big Time. Come on, people. Write something that actual people can read and comprehend. Use your words. And not the 3-dollar ones, either. Plain English will do just fine.
There's an old saying about being able to explain something in simple terms ... If you can't, you probably don't understand the subject yourself. Cheers. — Preceding unsigned comment added by 98.194.39.86 (talk) 03:44, 25 October 2016 (UTC)[reply]

Announcing new results

/(

"..(-2,334)‎ . .(Sorry, Wikipedia is not the place to announce new results. See WP:OR) .." - and wghere, WHERE is such place, for new ideas, announcing and discuss them, developing them more, and .."brainstorming",maybe ?, or just constructive discussion.. yap, according that "WP:OR" document, if we all would strictly managed by it, it would be means, - all advancing, all scientific and technical development would be almost stopped, in fact.. bcos no one would know about any new ideas, new results, new.. anything.. :/(

\( — Preceding unsigned comment added by Martin Hovorka (talkcontribs) 19:30, 2 September 2013 (UTC)[reply]

deletion of the announcement of a new idea

/(

Hmm... so I cannot even briefly announce any new idea (IMHO quite promising, relevant, and forward-looking: this multi-sieve/sifting Q.C. idea of mine), nor give it a little advertisement and spread this new, constructive, fresh and bright concept. :\( (But it is, and always was, my idea: trying to do quantum computing with this sieve/sifter approach, i.e. sieving.) — Preceding unsigned comment added by Martin Hovorka (talkcontribs) 20:57, 2 September 2013 (UTC)[reply]

It's your fault for announcing the idea here. Doesn't everyone know by now that, if you want to patent an invention, you must do so within one year of publication. Under the new US patent law, publication may make it impossible to patent it at all. (And, yes, posting it on Wikipedia constitutes "publication".) — Arthur Rubin (talk) 21:07, 2 September 2013 (UTC)[reply]

other wikis for discussing quantum ideas

There is a common misconception that people who support the WP:OR policy are people who don't want to see any original research on any wiki.

Actually, some of us *do* want to see original research on a wiki -- but on an appropriate wiki where such research is on-topic, not just any random wiki.

There are over 10,000 English-language wikis. Pretty much any topic you can think of is on-topic at at least one of them. In particular, quantum computing seems to be on topic at several wikis, including http://quantiki.org/ , Wikia: 3dids, http://quantum.cs.washington.edu/ , http://wiki.qcraft.org/ , http://c2.com/cgi/wiki?QuantumComputing , http://twoqubits.wikidot.com/ , http://chemwiki.ucdavis.edu/Physical_Chemistry/Quantum_Mechanics , http://www.physics.thetangentbundle.net/wiki/Quantum_mechanics , etc.

--DavidCary (talk) 04:30, 15 January 2014 (UTC)[reply]

Quantum supercomputer

Is it really the case that a quantum computer is "also known as a quantum supercomputer"? I've never seen that usage before & suspect it should be deleted. --24.12.205.104 (talk) 00:48, 6 January 2014 (UTC)[reply]

misc suggestions / solving chess - programming the universe - more intuitive intro

I think it would be great if the article mentioned solving chess as something that quantum computing would allow.

I also think Seth Lloyd's book "Programming the Universe" should be referenced somewhere.

The first sentence, I think, is too technical. It tells precisely what the term means in physics, but it doesn't give any layman sense of what quantum computing is. It almost sounds like: "Quantum computing is a process where computers use quantum systems to do computing". A layman will want to know something basic immediately, like the fact that quantum computers are extremely fast and will soon be replacing standard computers.

...My 2 cents. Squish7 (talk) 00:40, 26 March 2014 (UTC)[reply]

Page title should be "Quantum computing"

To my sensibility, this page title is all wrong, or maybe not entirely; we simply have to create a more general page about "general aspects and applications of quantum computing", covering things like quantum transistors that work partially as "quantum computers" but are cheaper to produce.

"Computer science is no more about computers than astronomy is about telescopes." - Edsger Dijkstra

Additionally, Google

 "quantum computing" -"quantum computer"

gives 2,440,000 results

 -"quantum computing" "quantum computer" 

gives 961,000 results, so one appears without the other significantly more often. — MaxEnt 05:10, 5 May 2014 (UTC)[reply]

 Done. Ruud 18:01, 21 December 2014 (UTC)[reply]

Quantum games

A collection of IPs is adding an announcement of "the first quantum computer game". Even if this were sourced to a reliable source (it's not at all in the reference specified, which is a blog [not even a blog entry], and all I can find is a blog entry pointing to the announcement by the creator), would it be appropriate for the article? — Arthur Rubin (talk) 05:49, 21 May 2014 (UTC)[reply]

The announcement by the alleged creator is not a reliable source, either. — Arthur Rubin (talk) 10:39, 24 May 2014 (UTC)[reply]

strongly need to consider changing the title of this wiki-page from Q. "Computer" to "Computing"

Can the title of the whole wiki-page be returned to Quantum COMPUTING, as it was before? "Computing" is a far more fitting, more overall term for this wiki page's topic than just one concrete "computer". — Preceding unsigned comment added by 78.99.236.255 (talk) 17:49, 23 July 2014 (UTC)[reply]

 Done. Ruud 18:01, 21 December 2014 (UTC)[reply]

Invented When?

You should discuss when Quantum computing was invented and the purpose of Quantum Computing at the time.

2601:C:5600:27B:C92A:EEBE:6758:C647 (talk) 17:52, 18 January 2015 (UTC)Johnnie DeHuff[reply]

The basic idea occurred as early as 1965, in the Feynman Lectures on Physics, vol. 3, when Feynman mentioned that it was frequently faster to do the physical experiment than to do the mathematical computation. In other words, use a physics experiment as an analog computer. --Ancheta Wis   (talk | contribs) 17:56, 1 February 2015 (UTC)[reply]

Minthreadsleft

Why is it 10? If we're going to archive, 5, or possibly 2, would be better. I already archived one thread which was (potentially) about the subject, rather than about the article. — Arthur Rubin (talk) 22:52, 4 May 2015 (UTC)[reply]

Changed to 5. I also "archived" 3 sections about alleged simulators of quantum computers, with no credible source. — Arthur Rubin (talk) 12:30, 31 May 2015 (UTC)[reply]

Three-bit Example?

It would be very helpful to see an example of a calculation or problem solved at the three bit level. Is there such a thing? JFistere (talk) 03:11, 15 October 2015 (UTC)[reply]

simplify intro!

The technical expertise in this article is very good, but the intro is way too technical. The intro should be a thesis statement, especially for newbies. I'll back-read the terminology you used and see if I can glean enough to simplify the intro for you. No promises though!

By the way, are the states referred to in the article the spin of the qubit?

I just found a good nuts 'n bolts explanation here: http://www.wired.com/2014/05/quantum-computing/ Pb8bije6a7b6a3w (talk) 18:57, 26 October 2015 (UTC)[reply]

misleading lede

  • "Given sufficient computational resources, however, a classical computer could be made to simulate any quantum algorithm, as quantum computation does not violate the Church–Turing thesis.[10]"

The whole article is currently very technical, which is OK with me, even though I cannot understand it well enough to judge how correct it is... But it needs more content that is at a lower level, and it needs to be careful not to mislead readers who are not so technical. The above sentence currently concluding the lede is a prime example of content which is probably correct in theory but seriously misleading, particularly in the lede context. Most ordinary readers would not understand that "sufficient computational resources" includes unbounded quantities thereof, with no regard for feasibility. Yes, we may think quantum computers can only solve problems that a classical computer could solve if it were big enough and had enough time. But there are many problems that would require using the entire universe (organized as a classical computer) to solve, and would still take longer than the expected life of the universe. If we think quantum computers will be able to solve some such problems (within feasible, limited size and time constraints), we can fairly say there is a distinct difference in the problems these two classes of computers can solve, even if both classes of computers are only able to solve (all) Turing problems "given enough resources". I don't really know enough to know how the lede should be fixed, but maybe I will Be Bold anyway, to get the ball rolling... -71.174.188.32 (talk) 19:11, 10 December 2015 (UTC)[reply]

I could not agree more. This article reads like an excerpt from a sub-par college textbook for a class called "Hypothetical Quantum Computing Systems" or something. Textbooks are fine when confined to academia; Wikipedia is supposed to be for the masses. With so much jargon being tossed about, it appears that this subject is well-defined and based on real-world working models, which is simply NOT the case. 99% of what I read is pure theory at this time (2016). 98.194.39.86 (talk) 03:54, 25 October 2016 (UTC)[reply]
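To put a rough number on the point about "sufficient computational resources" (back-of-the-envelope arithmetic only, not proposed article text): even a modest register is already far beyond any conceivable classical hardware if you simulate it the brute-force way.

 import math
 
 # Order-of-magnitude arithmetic; 10^80 is the usual rough estimate for the
 # number of atoms in the observable universe.
 amplitudes = 2 ** 300                   # brute-force classical description of 300 qubits
 atoms_in_universe = 10 ** 80
 
 print(round(math.log10(amplitudes)))    # ~90, i.e. about 10^90 amplitudes
 print(amplitudes > atoms_in_universe)   # True: "given enough resources" hides a lot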

Do I have this right?

I've been trying to ground myself in this subject in order to attempt a good thesis statement. The way I understand it, those qubits would explore all possibilities for a given expression - between two clock ticks of the quantum computer. Is that right? Pb8bije6a7b6a3w (talk) 00:28, 11 December 2015 (UTC)[reply]

No, that's not right. "Exploring all possibilities" is not an accurate description of quantum computing. For comparison with the double-slit experiment, it is not correct to say that "a photon explores both slits and then chooses the right one". It is equally incorrect to say that "a quantum computer explores all possibilities and then chooses the right one". Think of quantum computing as operations that are meant to have a specific impact on correlated interference patterns. MvH (talk) 04:59, 14 January 2017 (UTC)[reply]
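A toy illustration of the point above, simulated classically (sketch only): apply a Hadamard gate twice to a single qubit. If the qubit were really "exploring both options and then choosing one", the second gate would still leave a 50/50 outcome; instead the two paths leading to |1> interfere destructively and the qubit returns to |0> with certainty.

 import numpy as np
 
 H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
 zero = np.array([1, 0], dtype=complex)
 
 one_H = H @ zero                   # equal amplitudes on |0> and |1>
 two_H = H @ one_H                  # the two contributions to |1> cancel
 
 print(np.abs(one_H) ** 2)          # [0.5 0.5]
 print(np.abs(two_H) ** 2)          # approximately [1 0]: interference, not "choosing"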

Quantum vials / cartridge entanglement

Large multi-mono-atomic groupings cannot maintain cohesion for long. An old idea is to replace each atom with a vial (cartridge), so now each vial plays the role of a single atom, and then we entangle the vials with each other. The problem is that particles inside a vial may entangle with other ones in the same vial (so the vial no longer acts as a single atom); to avoid that we use few atoms in each vial [seems clever but it isn't, for other reasons: small mistakes become important] and we cool each vial to make it almost a condensate [seems clever but it isn't, for other reasons: each person should have a quantum computer, so superconductivity at room temperature is the only solution, not extreme temperatures]. Some reverberative NON-MEASURED laser radiation (the laser should allow different quantum states of oscillation that will not be revealed to us) will increase cohesion inside the mini vats. Sounds great, but until now the result is randomness. Crucial details are under development. Akio (Akio) 00:28, 13 May (UTC)

classical computer analogues

A classical computer analogue uses a noise generator to introduce noise when needed, plus entangled quantum rotation statistics. Then we run the wavefunctional collapse trillions of times so that we can average the result (most tasks, though, do not require large averaging, so even a billion or a million runs might work for simple tasks). That quantum analogue is 10^7 times slower than an ideal quantum computer, but it works; also, we don't have large arrayed quantum entangletrons (computers). Human averaging algorithms might not be perfect, but a good noise generator is way better than natural noise if we add in some of the measurement mistakes of a future actual large entangletron (quantum computer). Yoritomo (Yoritomo) 04:01, 15 May (UTC)

Each digit should be partially connected (at variable rates that change for each task) with all other digits (at a unique percentage to each other digit), but also to separate non-interconnected randomness (at variable rates that change for each task, at a unique percentage to each other digit). This is called "tensor computing". That works fine for many tasks: you might not get the absolute answer, but you will get close after a trillion trials. If you want the absolute answer (well, to get even closer, because even getting the absolute answer is probabilistic), you should then allow partial variable dynamics to take action, a combinatorics method of "communicating vessels" (well, partially communicating at different rates for each digit and partially introducing variable separate-noise rates to each digit). These combinatoric communicating vessels use as their grain an algorithmic module that acts as the smallest particle. An actual quantum computer uses an infinitely small grain-module and we cannot, but we can get very close and minimize our grain-module enough to perform each task with a classical supercomputer. Nowadays a classical pseudo-entangletron is better than an entangletron (quantum computer) simply because we can already build a very complex one! Ieyasu (Ieyasu) 04:32, 15 May (UTC)
Your first digit is totally random; all other digits are probabilistically, separately random (not of the same random series) with some statistical probability of introducing the initial random digit. — Preceding unsigned comment added by 2A02:587:4103:2300:C8D5:96C1:1BF2:71A7 (talk)

External links modified

Hello fellow Wikipedians,

I have just modified one external link on Quantum computing. Please take a moment to review my edit. If you have any questions, or need the bot to ignore the links, or the page altogether, please visit this simple FaQ for additional information. I made the following changes:

When you have finished reviewing my changes, please set the checked parameter below to true or failed to let others know (documentation at {{Sourcecheck}}).

This message was posted before February 2018. After February 2018, "External links modified" talk page sections are no longer generated or monitored by InternetArchiveBot. No special action is required regarding these talk page notices, other than regular verification using the archive tool instructions below. Editors have permission to delete these "External links modified" talk page sections if they want to de-clutter talk pages, but see the RfC before doing mass systematic removals. This message is updated dynamically through the template {{source check}} (last update: 5 June 2024).

  • If you have discovered URLs which were erroneously considered dead by the bot, you can report them with this tool.
  • If you found an error with any archives or the URLs themselves, you can fix them with this tool.

Cheers.—InternetArchiveBot (Report bug) 12:43, 21 July 2016 (UTC)[reply]

No mention of China?

Claims of quantum computing superiority (including launching of a quantum satellite) in second-half of Sept. 2016 interview at: https://www.rt.com/shows/keiser-report/358371-episode-max-keiser-963/ 72.171.152.192 (talk) 18:17, 17 September 2016 (UTC)[reply]

Error in "Basis" section?

"A double qubit can represent a one, a zero, or any quantum superposition of those two qubit states; a pair of qubits can be in any quantum superposition of 4 states, and three qubits..."

Should presumably read "A single qubit..." etc. Northutsire (talk) 19:24, 4 October 2016 (UTC)[reply]
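For reference, the counting that paragraph is trying to state, written out (standard definitions, in LaTeX):

 |\psi_1\rangle = \alpha_0 |0\rangle + \alpha_1 |1\rangle,
 \qquad
 |\psi_2\rangle = \alpha_{00}|00\rangle + \alpha_{01}|01\rangle + \alpha_{10}|10\rangle + \alpha_{11}|11\rangle

In general, n qubits are described by 2^n complex amplitudes \alpha_x with \sum_x |\alpha_x|^2 = 1, which is the "two states, four states, ..." progression the sentence is reaching for.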

Error in usage of the term non-determinism

The term non-deterministic was used twice in the article, the first time properly, in the sentence "Quantum computers share theoretical similarities with non-deterministic and probabilistic computers.", and the second time in a wrong way in the sentence "Quantum algorithms are often non-deterministic, in that they provide the correct solution only with a certain known probability." It is highly misleading as the term non-deterministic computing and the term probabilistic computing are very different in computer science. I corrected the error and clarified that the term non-determinism must not be used in that context. Of course, an alternative is simply to replace non-deterministic by probabilistic in the previous version, yet I find this to be a very good place to clarify to the physicists among the readers that the term non-determinism cannot be used for describing the probabilistic nature of (regular) quantum computers. Tal Mor (talk) 08:10, 29 November 2016 (UTC)[reply]

Sorry - I meant to clarify the issue to all the readers that are not computer scientists (not just to the physicists). Tal Mor (talk) 08:13, 29 November 2016 (UTC)[reply]

Make a paragraph, but also a new page: Quantum computing and deep learning

No human being can generate some complicated questions to ask a quantum computer about molecular statistics of mixed materials, so we define the parameters of our questions, but the final question is a result of computing! You don't need a computer only to get an answer, but even to finalize hard questions. We must be more analytical about how a quantum computer might act as a neural network, or how a neural network might control a quantum computer. Don't write random stuff. Collect some information for 4 years, and then write here. — Preceding unsigned comment added by 2A02:587:4116:2200:A123:2E94:AB7B:DF1 (talk) 11:10, 17 April 2017 (UTC)[reply]

D-Wave, Google, 2017

This YouTube video discusses a paper by Google employees,[1] which claims or demonstrates the computational advantage of D-Wave's quantum annealing concept, and I don't yet see it being discussed in the article. JMK (talk) 19:40, 3 September 2017 (UTC)[reply]

  1. ^ Denchev, Vasil S.; Boixo, Sergio; Isakov, Sergei V.; Ding, Nan; Babbush, Ryan; Smelyanskiy, Vadim; Martinis, John; Neven, Hartmut (1 August 2016). "What is the Computational Value of Finite-Range Tunneling?". Physical Review X. 6 (3). doi:10.1103/PhysRevX.6.031015. Retrieved 3 September 2017.

description of digital computer incorrect

Who will fix it? Juror1 (talk) 17:53, 30 November 2017 (UTC)[reply]

Too complex first sentence

The first two sentences read:

Quantum computing studies computational systems (quantum computers) that make direct use of quantum-mechanical phenomena, such as superposition and entanglement, to perform operations on data. Quantum computers are different from binary digital electronic computers based on transistors.

I changed to:

Quantum computing is computing using quantum-mechanical phenomena, such as superposition and entanglement. Quantum computers are devices that perform quantum computing. They are different from binary digital electronic computers based on transistors.

This was reversed with the motivation that the previous version "reads better". Here is my motivation for why my version is better:

It is shorter. It is divided into more sentences. (It is a well-established fact that shorter sentences are easier to read.) It says what the article is about. The previous version assumes, but does not say, that QC is a subject area. It says that QC is a phenomenon and not the study of this phenomenon. Isn't it decided that the phenomenon itself is more important than the study of it? Removing "to perform operations on data" makes it shorter, which is good. The first sentence should never use a word that is so difficult that it needs to be explained in the same sentence. (Because the first sentence should be as short as possible.) I think "computing" is OK. If it is too difficult for the reader, then reading the article about computing first is necessary to understand this article. --Ettrig (talk) 11:42, 13 December 2017 (UTC)[reply]

D-Wave 2000 Qubits - commercial today !?

https://www.dwavesys.com/d-wave-two-system
Isn't this a real quantum computer? With 2000 qubits!!!! It seems to me that they and NASA have once and for all solved the necessary superconducting problem, through vacuum. No particles means no temperature - and the superconducting problem is solved!?
Although they are expensive, large companies can afford them - and perhaps make more money? Boeing720 (talk) 00:31, 19 December 2017 (UTC)[reply]

The D-Wave device is not a universal quantum computer, see D-Wave Systems#Controversy. Luca (talk) 09:16, 19 December 2017 (UTC)[reply]

Explanations for the masses

I'd like to join the chorus asking for a bit more content pitched to the educated layman. In particular, the point that brought me here, in an unsuccessful search for an explanation, doesn't seem to be covered.

I second the call for a worked-through 3-qubit problem if that's feasible. That might help me visualize how, when the superposition collapses, you know that you are observing the solution to the problem you were running on the quantum computer. That seems to be saying that there is only one possible position, which doesn't track with the idea of superposition. Grossly simplified, if the problem you are trying to solve is 2+2, how do you manipulate the qubits to collapse the probabilities down to 100% for 4? MvH took a stab at this in #14 on the talk page, but it was too general to make much sense to a layman. 210.178.60.82 (talk) 12:41, 19 January 2018 (UTC)[reply]
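Not 2+2, but here is roughly the smallest worked example of "manipulating the qubits so the probabilities collapse to 100% for the answer" that I can offer: Grover's search over four items on two qubits, simulated classically below (an illustrative NumPy sketch, not proposed article text). One oracle call plus one reflection about the mean moves all of the probability onto the marked item, so the final measurement can only return the answer.

 import numpy as np
 
 # Grover search over 4 items (2 qubits); item 3 (binary 11) is the one we want.
 marked = 3
 state = np.full(4, 0.5, dtype=complex)    # uniform superposition: each item has probability 1/4
 
 state[marked] *= -1                       # oracle: flip the sign of the marked item's amplitude
 state = 2 * state.mean() - state          # diffusion: reflect every amplitude about the mean
 
 print(np.abs(state) ** 2)                 # [0 0 0 1]: measuring now yields 11 with certainty

For larger search spaces one iteration is not enough and the probability is not exactly 1; you repeat the same two steps about sqrt(N) times. The two-qubit case just happens to come out exact, which makes it a convenient toy answer to the question above.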

quantum supremacy

The initial addition on Quantum Supremacy (16:54, 5 July 2017 by user Schlafly) provided a definition of the term, and seemed to add additional information to the base subject matter of quantum computing. On 13:29, 12 December 2017, user 185.78.8.131 edited the section and added material relating to assertion of controversy about the concept of Quantum Supremacy, and apparently to quantum mechanics in general.

First, controversy is fine, and I feel it should be placed in its own section rather than being appended to an existing one. However, it may be better placed in the article on quantum mechanics, where it may receive a greater level of review.

Second, the sentence starting with "Those such as Roger Schlafly," caught my attention. Wondering to whom "those" referred, I checked the three references given and found that they all point to a blog authored by Schlafly. To me, this seems to run afoul of original research and/or NPOV and/or reliable sources.

So I've flagged this section with POV-section. — Preceding unsigned comment added by Debrown (talkcontribs) 17:27, 1 February 2018 (UTC)[reply]

Non-fundamental mimicking of the shared noise attractor, which may include a percentage of constant values (some percentage of the time the attractor is activated, so we have pseudorandom data within a range of values, and some percentage of the time the machine runs, a constant value appears according to the entanglement angles chart; a randomizer selects between a. the pseudorandom data of the attractor and b. the constant value - a. and b. have standard percentages of activation, but they appear randomly via a randomizer - and we have to run the program many times so that we can average the results.)


This isn't an actual quantum computer but a Pseudoquantum one.

It's useful for some problems. — Preceding unsigned comment added by 2A02:2149:8430:F500:24AD:8759:82E7:D432 (talk) 16:23, 9 February 2018 (UTC)[reply]

And how do you find the correct constant number and the correct attractor? (At the beginning you can start with known problems and use it for educational purposes; then you can upgrade the system.) — Preceding unsigned comment added by 2A02:2149:8430:F500:24AD:8759:82E7:D432 (talk) 16:31, 9 February 2018 (UTC)[reply]

noise testing

In some problems, random (false/erroneous) solution testing is helpful.

In bigger problems it doesn't work well. — Preceding unsigned comment added by 2A02:2149:8430:F500:24AD:8759:82E7:D432 (talk) 16:38, 9 February 2018 (UTC)[reply]

Analogue percentage of voltage through analogue logic gates and noise sharing through analogue logic gates

Analogue systems add more noise... but it's better than nothing. Analogue logic gates allow interference phenomena to occur. You run it many times, and all percentages of voltage per order of digit must be described by their lower-order significance digits (and some information is lost, plus we have noise). — Preceding unsigned comment added by 2A02:2149:8888:3A00:A10A:46B7:5026:A29A (talk) 00:30, 17 February 2018 (UTC)[reply]

We don't have 0 and 1, but values from 0% to 100%, and when we have less than 100% the digit has to become zero and the lower-ranked digits should then express the number with absolute zeros and ones.
The initial percentaged analogue digits should communicate with analogue logic gates and a random generator which is activated sometimes, and another random generator shuffles the percentages of entangled configuration vs. pure noise.
  • Sometimes we have 40% entanglement and 60% noise; we shuffle through the modes of operation with another randomizer.
  • The entanglement can be the same number, the opposite number, or any known standard logic operation.

Dubious timeline

I find it odd that Yuri Manin is listed as the first person to propose the idea of quantum computation. The reference given is a 1980 book on (classical) computability which makes a brief mention (only in its preface!) of the possibility of using quantum effects for computation. Nowhere in the body of that book is quantum computing discussed. The idea of using quantum effects for computation was already articulated in Feynman's 1959 talk There's Plenty of Room at the Bottom:

Atoms on a small scale behave like nothing on a large scale, for they satisfy the laws of quantum mechanics. So, as we go down and fiddle around with the atoms down there, we are working with different laws, and we can expect to do different things. We can manufacture in different ways. We can use, not just circuits, but some system involving the quantized energy levels, or the interactions of quantized spins, etc.

— Richard P. Feynman, "There's Plenty of Room at the Bottom", American Physical Society (1959)

The current timeline is dubious and needs to be fixed. Feynman's 1959 talk is the earliest reference I know of (and is cited in the Britannica article on quantum computing), but there may be others. Would anyone care to comment/suggest other possible early references on using quantum effects in computing devices? Stablenode (talk) 17:01, 21 April 2018 (UTC)[reply]

Following up, an English translation of Manin's entire discussion of quantum computation in his 1980 book is given in Manin's own 1999 paper Classical computing, quantum computing, and Shor's factoring algorithm. It is literally 3 paragraphs long.

The following text is a contribution to the prehistory of quantum computing. It is the translation from Russian of the last three paragraphs of the Introduction to [Ma2] (1980). For this reference I am grateful to A. Kitaev [Ki]. “ Perhaps, for better understanding of this phenomenon [DNA replication], we need a mathematical theory of quantum automata. Such a theory would provide us with mathematical models of deterministic processes with quite unusual properties. One reason for this is that the quantum state space has far greater capacity than the classical one: for a classical system with N states, its quantum version allowing superposition accommodates c^N states. When we join two classical systems, their number of states N1 and N2 are multiplied, and in the quantum case we get exponential growth c^(N1*N2). These crude estimates show that the quantum behavior of the system might be much more complex than its classical simulation. In particular, since there is no unique decomposition of a quantum system into its constituent parts, a state of the quantum automaton can be considered in many ways as a state of various virtual classical automata. Cf. the following instructive comment at the end of the article [Po]: ‘The quantum–mechanical computation of one molecule of methane requires 10^42 grid points. Assuming that at each point we have to perform only 10 elementary operations, and that the computation is performed at the extremely low temperature T = 3.10^−3 K, we would still have to use all the energy produced on Earth during the last century.’ The first difficulty we must overcome is the choice of the correct balance between the mathematical and the physical principles. The quantum automaton has to be an abstract one: its mathematical model must appeal only to the general principles of quantum physics, without prescribing a physical implementation. Then the model of evolution is the unitary rotation in a finite dimensional Hilbert space, and the decomposition of the system into its virtual parts corresponds to the tensor product decomposition of the state space. Somewhere in this picture we must accommodate interaction, which is described by density matrices and probabilities.”

— Yuri Manin, "Classical computing, quantum computing, and Shor's factoring algorithm", https://arxiv.org/abs/quant-ph/9903008 (1999)


Peter Shor in his 2000 arXiv article (Introduction to Quantum Algorithms) has a very clear summary of where Manin's contribution lies; it certainly cannot be said that he was the first to suggest the idea of quantum computing:

In 1982, Feynman [19] argued that simulating quantum mechanics inherently required an exponential amount of overhead, so that it must take enormous amounts of computer time no matter how clever you are. This realization was come to independently, and somewhat earlier, in 1980, in the Soviet Union by Yuri Manin [30]. It is not true that all quantum mechanical systems are difficult to simulate; some of them have exact solutions and others have very clever computational shortcuts, but it does appear to be true when simulating a generic quantum mechanics system. Another thing Feynman suggested in this paper was the use of quantum computers to get around this. That is, a computer based on fundamentally quantum mechanical phenomena might be used to simulate quantum mechanics much more efficiently. In much the same spirit, you could think of a wind tunnel as a “turbulence computer”. Benioff [5] had already showed how quantum mechanical processes could be used as the basis of a classical Turing machine. Feynman [20] refined these ideas in a later paper.

— Peter W. Shor, "Introduction to Quantum Algorithms", https://arxiv.org/pdf/quant-ph/0005003 (2000)

I propose the timeline be changed to reflect Feynman's earlier observations in the 1950s and make it clear that Manin's short observation in 1980 was articulating a motivation for pursuing this approach to computing. To claim that Manin spawned the field is ridiculous. Stablenode (talk) 19:57, 22 April 2018 (UTC)[reply]

Concrete proposals to improve this article

With the recent surge of interest in quantum computing, an increasing number of readers come to this article to learn what quantum computing is. But this article isn't as good as it could be, and I think it can be significantly improved. Maybe we can discuss some concrete proposals to improve the article? If we have reasonable consensus on some proposals, someone can go ahead and make changes. My suggestions are

  1. Merge the long Timeline section with the article Timeline of quantum computing and remove it from this article.
  2. More ambitiously, restructure the sections after the lede to clearly answer the questions a reader may have. I think the answers to many of these questions are already in the article, but some of them are hard to find. Some questions a reader may have:
    1. "What is a quantum computer?" (explain what it is)
    2. "Why do we want to build a quantum computer?" (describe applications and motivation for building one, examples of problems that can be solved exponentially faster on a quantum computer, etc.)
    3. "How are quantum computers made?" (discuss physical implementations, such as ion traps, etc., and the challenges in building a quantum computer)
    4. "Who is building quantum computers?" (Companies, universities, etc., perhaps with some examples. This could be part of the previous section, with examples of who is using which technology. E.g., Company X and Prof. Y at University Z are building quantum computers from ion traps.)
    5. "Where can I learn more?" (a short, curated further reading list with some guidance would be great; e.g., we should distinguish between popular science articles for the lay person, detailed overview articles for more interested readers, textbooks for undergraduates and grad students, etc.)
  3. We could follow the model of having a high-level encyclopedic overview in this article, and a more detailed explanation of what quantum computing is in another article, as done by quantum mechanics and introduction to quantum mechanics, and general relativity and introduction to general relativity (both of which are featured articles!).

That's all I can think of for now, but I'm sure others probably have concrete proposals to improve this article as well. Let's discuss them! — Robin (talk) 18:09, 20 May 2018 (UTC)[reply]

I don't understand this article at all

"The calculation usually ends with a measurement, collapsing the system of qubits into one of the {\displaystyle 2^{n}} 2^{n} eigenstates, where each qubit is zero or one, decomposing into a classical state.'

This is way too complicated for me, and I'm not exactly new to algorithms.

Please tell how quantum computing works in layman's terms. Like: "For example, if quantum computers had enough bits you could encode the Travelling Salesman problem for 5 cities like this, and after such-and-such trick, and measuring qubits 1 to 5, by simply ordering the probabilities you'd have the perfect route."

Joepnl (talk) 00:10, 13 June 2018 (UTC)[reply]
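Stripped of jargon, the quoted sentence says: at the end of the computation you read out one ordinary n-bit string, and which string you get is random, with probability equal to the squared magnitude of that string's amplitude. A three-qubit toy version, simulated classically (illustrative NumPy sketch only):

 import numpy as np
 
 # A 3-qubit state with equal weight on |000> and |111> and nothing else.
 n = 3
 amps = np.zeros(2**n, dtype=complex)
 amps[0b000] = amps[0b111] = np.sqrt(0.5)
 
 probs = np.abs(amps) ** 2                    # Born rule: probability = |amplitude|^2
 outcome = np.random.choice(2**n, p=probs)    # one measurement = one classical 3-bit result
 print(format(outcome, "03b"))                # prints 000 or 111, each about half the time

The whole art of a quantum algorithm is arranging the earlier steps so that, by the time you measure, nearly all of the probability sits on strings encoding the answer you wanted; the two-qubit Grover sketch a few sections up is the simplest example of that.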