Monday 20 August 2012

Well over 90% of visitors to threads such as this Stack Exchange question – in which Gerard 't Hooft asks why professional physicists consider his research of the last 10+ years to be wrong – are either unwilling or unable to think rationally, or are simply ignorant of the basics of modern physics.

Pretty much every Internet forum about sufficiently fundamental physics questions is completely overwhelmed by cranks, and the market of popular books on quantum mechanics and newspaper articles isn't much different.

That's why one must be kind of happy to see every single person who hasn't lost his mind completely and who is still willing to "speak" in public. In this case, it's Peter Shor of MIT. He answered 't Hooft's question as follows:




PS: I can tell you why I don't believe in it. I think my reasons are different from most physicists' reasons, however.

Regular quantum mechanics implies the existence of quantum computation. If you believe in the difficulty of factoring (and a number of other classical problems), then a deterministic underpinning for quantum mechanics would seem to imply one of the following.
  • There is a classical polynomial-time algorithm for factoring and other problems which can be solved on a quantum computer.
  • The deterministic underpinnings of quantum mechanics require \(2^n\) resources for a system of size \({\mathcal O}(n)\).
  • Quantum computation doesn't actually work in practice.
None of these seem at all likely to me. For the first, it is quite conceivable that there is a polynomial-time algorithm for factoring, but quantum computation can solve lots of similar periodicity problems, and you can argue that there can't be a single algorithm that solves all of them on a classical computer, so you would have to have different classical algorithms for each classical problem that a quantum computer can solve by period finding.

For the second, deterministic underpinnings of quantum mechanics that require \(2^n\) resources for a system of size \({\mathcal O}(n)\) are really unsatisfactory (but maybe quite possible ... after all, the theory that the universe is a simulation on a classical computer falls in this class of theories, and while truly unsatisfactory, can't be ruled out by this argument).

For the third, I haven't seen any reasonable way that you could make quantum computation impossible while still maintaining consistency with current experimental results.



LM: It's of course the type of argument that uses tools characteristic of a man doing quantum computation, not fundamental high-energy or condensed-matter or other "real phenomena" physics, but I for one consider this concise answer totally valid and kind of indisputable.

Quantum mechanics implies that quantum computers may be constructed in principle and that once they're constructed, they work. This conclusion is really a totally straightforward application of the verified laws that govern the behavior of properties of all physical systems in our quantum Universe – that govern all the quantum information around us. One just combines a thousand (\(n={\mathcal O}(1{,}000)\)) degrees of freedom – a thousand qubits – and one can already build devices that break the codes the world relies upon today within seconds, or much faster than that.
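To get a feeling for the numbers here, a tiny Python sketch (purely illustrative, not part of the argument above) of how quickly the count of complex amplitudes in the wave function grows with the number of qubits:

```python
# A classical record of the full wave function of n qubits needs 2^n
# complex amplitudes.  Even at n = 1,000 the count is a number with
# hundreds of decimal digits -- far beyond any classical storage.
for n in [10, 50, 300, 1000]:
    digits = len(str(2 ** n))
    print(f"n = {n:4d} qubits: 2^{n} amplitudes "
          f"(a number with {digits} decimal digits)")
```

For comparison, the number of atoms in the observable universe is estimated to have only about 80 decimal digits.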

It is not really plausible that the laws we have verified for collections of several degrees of freedom, or quadrillions of degrees of freedom arranged in various ways etc., will fail for a thousand qubits in a quantum computer. It's still true that the size (dimension) of the Hilbert space has to grow exponentially with the number of qubits and it's still true that the Hilbert space obeys the superposition principle. Deviations from quantum mechanics that would be able to cripple the correct quantum behavior of \({\mathcal O}(1{,}000)\) qubits would almost certainly produce measurable deviations and problems already when tested on several qubits, and we know that no such deviations exist.

So if you avoid some science-fiction-like conspiracy theories that Nature is not only deceitful but literally evil and She is making it look like the laws of quantum mechanics work for one, two, or any number of atoms – as well as for their complicated macroscopic bound states containing trillions of trillions of atoms we have tested – you're bound to conclude that it must be possible in principle to create a gadget with just thousands (or millions, it doesn't really matter) of elementary building blocks that may crack these codes in a short enough time. Because we need this gadget to work just for thousands or millions of steps, it's enough to suppress the "decoherence per operation" (and improve the accuracy of the operations) by a factor of a million, which is doable because it's far less dramatic than the super-exponential size of the numbers that may be factored.
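The back-of-envelope arithmetic behind "it's enough to suppress the decoherence per operation" can be sketched as follows; all the concrete numbers below are illustrative assumptions, not engineering data:

```python
# If a computation takes ~10^6 elementary steps and each step fails
# independently with probability ~10^-7, the whole run still succeeds
# with high probability -- roughly exp(-10^6 * 10^-7) = exp(-0.1).
steps = 10 ** 6
error_per_op = 10 ** -7
p_success = (1 - error_per_op) ** steps   # independent-error model
print(f"success probability ~ {p_success:.3f}")   # ~0.905
```

Quantum error correction relaxes this even further: one doesn't need the physical error per gate to be this tiny, only below a fixed threshold.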

But this conclusion has far-reaching consequences for the proposal that there's any fundamentally classical or deterministic or realist "mechanism" running behind the scenes. If this were the case, it would be possible to break the codes with a fundamentally classical computer, too – simply because a quantum computer may solve these problems via Shor's algorithm and related algorithms.

This sounds inconceivable. Even if you were ready to believe that those "super difficult" problems in computer science have a polynomially fast classical solution, in contradiction with the beliefs of most computational complexity theorists, it wouldn't be enough to make this unusual picture of computer science compatible with the assumption of realist underpinnings of quantum mechanics. Because all the quantum algorithms really work on the same principle – some kind of periodicity finding by a method that looks like massively parallel computation – there would have to be a rather simple and unified classical algorithm that just emulates a quantum computer and solves all the difficult problems pretty much in the same way.
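To make the "periodicity finding" claim concrete, here is a hedged Python sketch of the classical reduction from factoring to order finding that Shor's algorithm exploits (the function names are mine, not standard). The brute-force `order` loop is exactly the step that is exponentially slow classically but polynomial on a quantum computer:

```python
import math
import random

def order(a, N):
    """Brute-force order finding: smallest r > 0 with a^r = 1 (mod N).
    Assumes gcd(a, N) == 1.  This loop may take up to ~N steps --
    exponential in the number of digits of N -- which is precisely
    the subroutine a quantum computer speeds up."""
    r, x = 1, a % N
    while x != 1:
        x = (x * a) % N
        r += 1
    return r

def factor_via_order(N, attempts=50):
    """The (classical) reduction from factoring to order finding."""
    for _ in range(attempts):
        a = random.randrange(2, N)
        g = math.gcd(a, N)
        if g > 1:
            return g              # lucky: a already shares a factor with N
        r = order(a, N)
        if r % 2 == 0:
            y = pow(a, r // 2, N)
            for g in (math.gcd(y - 1, N), math.gcd(y + 1, N)):
                if 1 < g < N:
                    return g
    return None

print(factor_via_order(15))       # a nontrivial factor: 3 or 5
```

Replace the brute-force `order` with quantum period finding and this becomes Shor's algorithm; the rest of the reduction stays classical and easy.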

All these things are unimaginable. The classical computer would really have to be very simple. Because the basic idea behind Shor's algorithm is also simple, you can't rely on any hyper-complicated classical algorithms that would resemble Wiles' proof of Fermat's Last Theorem. The classical algorithm to factor large numbers would pretty much be bound to be composed of two steps – a simple enough gadget that emulates quantum mechanics; and Shor's algorithm or a similar one added on top of that.

If this were possible, it should be really simple to see what the "emulation" of quantum mechanics would look like.

The only conceivable classical solution that imitates quantum mechanics "faithfully enough" is one that remembers the whole wave function or the density matrix but gives them an invalid interpretation – it treats them as classical observables – and is equipped with some extra "collapses". These collapses guarantee that the wrong "wave function is a classical wave" interpretation is effectively converted to the right "wave function is just a probability (amplitude) wave" one, so that it doesn't conflict with the observations – because the fact that the wave function is just a template to get probabilities is something we directly observe in experiments (although some crazy people are trying to turn it into a super-controversial, mysterious assumption).
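A minimal sketch of such a classical "fake" – storing every amplitude as a classical variable and bolting on an explicit collapse – could look like this (all function names are illustrative; note that the classical record already needs \(2^n\) entries for \(n\) qubits):

```python
import math
import random

def hadamard(state, k, n):
    """Apply a Hadamard gate to qubit k of an n-qubit state vector,
    stored here as 2^n 'classical' amplitudes."""
    s = 1 / math.sqrt(2)
    out = state[:]
    for i in range(2 ** n):
        if not (i >> k) & 1:
            j = i | (1 << k)
            out[i] = s * (state[i] + state[j])
            out[j] = s * (state[i] - state[j])
    return out

def collapse(state):
    """The bolted-on 'collapse': sample one outcome with the Born-rule
    probabilities |amplitude|^2, discarding the rest of the 'wave'."""
    r, acc = random.random(), 0.0
    for i, a in enumerate(state):
        acc += abs(a) ** 2
        if r < acc:
            return i
    return len(state) - 1

n = 3
state = [0.0] * 2 ** n
state[0] = 1.0                       # start in |000>
for k in range(n):
    state = hadamard(state, k, n)    # uniform superposition: 8 amplitudes
print(collapse(state))               # one of 0..7, each with probability 1/8
```

The point is the storage cost: already here the list `state` has \(2^n\) entries, and the "collapse" is an extra rule glued on by hand rather than anything derived from the dynamics.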

(Such collapse-based fakes of quantum mechanics have to be infinitely fine-tuned to agree with things like the Lorentz symmetry of all effects but they're still fundamentally wrong because in Nature, the wave function simply isn't an objective observable and this fact may be demonstrated and has been demonstrated.)

But if you adopt such a classical fake of quantum mechanics, the number of classical degrees of freedom – recall that the individual amplitudes in the wave function have become classical observables – will grow exponentially with the number of qubits, i.e. the number of quantum degrees of freedom. This is really implausible as well. In both classical field theory and quantum field theory, the experimentally verified Lorentz symmetry seems to imply that the number of elementary observables scales with the volume. If the quantum field theory is an effective description including gravity, the true number of degrees of freedom grows even more slowly than that – it's proportional to the surface area (because of the holographic principle).

It seems implausible that you could design a theory that agrees with the precision tests of the Lorentz symmetry in so many contexts and that would nevertheless imply that the number of degrees of freedom grows exponentially with the number of qubits in a quantum computer. And you may verify that none of the proposed theories (or classes of theories) intended to fake quantum mechanics can achieve such a goal. Such a dramatic superlinearity would contradict not just the apparent absence of actions at a distance; it would really conflict with the extensiveness of ordinary materials such as air and water, too (and other basic facts).

The logic is really waterproof and any proponent of a fundamentally "realist" description of quantum mechanics must choose one of Shor's options. Incredibly enough, Gerard 't Hooft chooses the "quantum computers can't work" option which is almost certainly the least plausible option among the three implausible ones. (Of course, I've known about this insane claim by 't Hooft for many years.)

A quantum computer is just a clever piece of applied maths or engineering. It combines many "qubits", quantum degrees of freedom whose behavior has been verified separately or in small groups or in special large groups (but very accurately), and local steps that may be applied to the state of these objects (e.g. electrons' spins). What could possibly go wrong that could prevent the quantum computer from working but that you wouldn't notice in any of the experimental tests that quantum mechanics has been passing flawlessly for almost a century? It just makes no sense whatsoever.

If you have \(n\) qubits, the dimension of the Hilbert space is \(2^n\). It has to be so because each qubit is allowed to be in a state independent of the others. If a group of 1,000 electrons were already forced by Nature to adjust their behavior according to some collective properties, it would mean that Nature includes a very blatant action at a distance. If this were the case, we would have already observed telepathy, telekinesis, voodoo, and other things. Imagine that Nature were secretly imposing quotas on the number of electron spins in a region that can be up. Such quotas would instantly lead to violent violations of the angular momentum conservation law, and so on. While my sister who just returned from the Summer Tarot School for Beginning Witches ;-) surely thinks that such phenomena are commonplace, I think that people familiar with the basics of physical sciences realize that they have never been observed despite many somewhat sophisticated attempts to do so.
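The claim that independent qubits force the dimension \(2^n\) – and that generic states are not mere lists of independent single-qubit states – can be checked numerically. For two qubits, any product state \((a_0,a_1)\otimes(b_0,b_1)\) satisfies \(c_{00}c_{11}=c_{01}c_{10}\); the Bell state below fails that criterion (a small illustrative check, not from the original text):

```python
import math

# Two independent qubits (a0, a1) and (b0, b1) combine into the product
# state (a0*b0, a0*b1, a1*b0, a1*b1): 4 amplitudes, and 2^n in general.
# Every product state obeys c00*c11 == c01*c10, so the Bell state below
# cannot be split into two independent single-qubit states: it is entangled.
s = 1 / math.sqrt(2)
bell = [s, 0.0, 0.0, s]            # (|00> + |11>)/sqrt(2)
c00, c01, c10, c11 = bell
print(round(c00 * c11, 3), c01 * c10)   # 0.5 vs 0.0 -- not a product state
```

Entangled states like this one are exactly why the full \(2^n\)-dimensional space is needed: no smaller "per-qubit" bookkeeping reproduces them.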

You could invent other "toy models" in which Nature doesn't remember an exponentially large number of the complex amplitudes for 1,000 qubits. For example, Nature secretly chooses a basis and only remembers 1 trillion of the largest complex amplitudes (in absolute value) among all the amplitudes while the rest is set to zero after each Planck time, and the whole wave function is renormalized. Imagine anything of the sort and think of the consequences. You would see that any such unnatural interventions would immediately lead to consequences – spectacularly observable new "supernatural" effects. All these interventions would be nonlocal effects that may easily become macroscopic if they operate in macroscopic situations. For example, because the largest amplitudes would likely be either positively or negatively correlated with a large energy, the elimination of the smaller or larger amplitudes would lead to an energy increasing or decreasing on average: a perpetual motion machine would become possible. And so on, and so on. There's just no way to avoid these things.
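Such a truncation toy model is easy to write down explicitly, and its pathologies show up immediately. A minimal sketch, with the cutoff `K` and the four-amplitude state being illustrative assumptions:

```python
import math

def truncate(state, K):
    """The toy 'intervention': keep the K largest amplitudes (by absolute
    value), zero the rest, and renormalize.  The map is nonlinear and
    discontinuous -- nothing like unitary quantum evolution."""
    keep = set(sorted(range(len(state)), key=lambda i: -abs(state[i]))[:K])
    out = [a if i in keep else 0.0 for i, a in enumerate(state)]
    norm = math.sqrt(sum(abs(a) ** 2 for a in out))
    return [a / norm for a in out]

state = [0.8, 0.4, 0.4, 0.2]       # normalized: 0.64 + 0.16 + 0.16 + 0.04 = 1
new = truncate(state, 2)
print([round(abs(a) ** 2, 3) for a in state])  # [0.64, 0.16, 0.16, 0.04]
print([round(abs(a) ** 2, 3) for a in new])    # [0.8, 0.2, 0.0, 0.0]
```

The probabilities of the surviving outcomes jump discontinuously (0.64 to 0.8 here), two outcomes become strictly impossible, and which amplitudes survive depends on the secretly chosen basis – all of which would be experimentally visible.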

The excuse that there are many degrees of freedom isn't a license for you to mess with the laws of quantum mechanics. Quantum mechanics has been tested not only for a few degrees of freedom; its numerous predictions have been verified for trillions of trillions of degrees of freedom, too. Only if quantum mechanics produces the right classical limit for macroscopic bodies may you argue that it inherits the agreement with experiment from its classical predecessor. Any selective filtering or messing with the quantum information in the case of many qubits – something needed to "kill" the quantum computers – would imply that quantum mechanics no longer has the right classical limit, so it fails even in tests that were done before quantum mechanics was born. Everyone who suggests that quantum mechanics has only been verified for 2 or 3 spins etc. misunderstands quantum mechanics in the same sense as those who say that physics only works for the motion of planets but it says nothing about phenomena that matter to humans (where witchcraft takes over).

There can't be any constraint that would prevent 1,000 qubits from being allowed to try all \(2^{1,000}\) states. The holographic limitations of quantum gravity are closest to this possibility but they may only become relevant when you see that they're relevant – when the matter is too dense and collapses into a black hole. When you don't see anything this spectacular, it's just impossible for electrons' spins to routinely deviate from their quantum-mechanics-predicted behavior.

Now, the quantum computer is capable of doing several basic operations on localized groups of the qubits. Those things have really been tested in isolation; the ability of these groups to work is physically equivalent to many tests of the laws of quantum mechanics that have already been performed.

An imitation of quantum mechanics that agrees with the normal experiments verifying quantum mechanics but that also miraculously bans quantum computers is exactly as crazy as a theory of classical computers that suddenly prevents you from connecting a GPU to a microprocessor. It can't happen. The denial of the existence of quantum computers is the complete and full-fledged denial of all of quantum mechanics. I am amazed by people such as Mr Ron Maimon who are ready to say that they really don't deny anything about quantum mechanics at all, that it's just a straw man. They "only" deny quantum computers, the superposition principle, the existence of entanglement (at least when there are many degrees of freedom), the unitarity of the evolution, every other postulate of quantum mechanics, and every single consequence of quantum mechanics (perhaps unless it may be verified directly in their kitchen, while they're allowed to forget everything they've learned previously).

It's a stunning amount of incoherence and stupidity.

Why don't they test their ideas against at least a single and simple situation, a collection of 3 qubits, the ammonia molecule, anything else? They would see as clearly as I have when I did these tests that the status of all these "theories" is exactly on par with telepathy. They just don't copulating work at all.

And that's the memo.
