Thursday 23 August 2012

Two days ago, Scott Aaronson was rightly confused by some bizarre statements in the literature on "superqubits", and he asked:

Two very intriguing papers recently appeared on the arXiv, claiming that one can use "superqubits" -- a supersymmetric generalization of qubits -- to violate the Bell inequality by more than standard quantum mechanics would allow. (That is, they claim one can violate the Tsirelson bound, which says that the CHSH game can be won quantum-mechanically with probability at most \(\cos^2(\pi/8) \sim 0.85\).) The first paper is by Borsten, Brádler, and Duff and the second is by Brádler. (LM: Kamil Brádler is a quantum information theorist trained at my Alma Mater in Prague.)

Alas, I remain deeply confused about the physical meaning of these results, if any. As the authors define them, "superqubits" seem to involve amplitudes that can be Grassmann numbers rather than just complex numbers. While I know next to nothing about the topic, that seems like a fundamental departure from "supersymmetry" in the sense that high-energy physicists use the term! I take it that supersymmetry is "just" a proposed new symmetry of nature, alongside the many other symmetries we know, and doesn't involve tampering with the basic rules of quantum mechanics (or with spatial locality). In particular, in supersymmetric theories one still has unit vectors in a complex Hilbert space, unitary transformations, etc.

If that's correct, though, then what on earth could superqubits have to do with supersymmetry in physics---besides perhaps just repurposing some of the same mathematical structures in a totally different context? Is there any possibility that, if nature were supersymmetric in such-and-such a way, then one could do an actual experiment that would violate Tsirelson's bound?





Your humble correspondent answered:

I completely agree with Scott that this particular "Grassmannization" isn't equivalent to what supersymmetry is doing in physics. Supersymmetry is a constraint that picks out a subset of theories – ordinary theories with ordinary bosonic and fermionic fields that are just arranged (and whose interactions are arranged) so that there is an extra Grassmann-odd symmetry. Because supersymmetric theories are a subset of more general theories, all the general inequalities that hold for the more general theories of course hold for supersymmetric theories, too. And there are many new, additional inequalities and conditions that hold for supersymmetric theories – but not fewer constraints.

In supersymmetric theories, probability amplitudes never become Grassmann numbers. Only particular observables are fermionic operators – operator counterparts of Grassmann-number-valued quantities in classical physics. These fermionic operators only have nonzero matrix elements between Grassmann-odd and Grassmann-even states, for the same reason that bosonic operators only have nonzero matrix elements between states of the same grading. One may introduce a grading on the Hilbert space but the amplitudes are still complex commuting \(c\)-numbers.

If a basis of the Hilbert space has Grassmann-even as well as Grassmann-odd states (e.g. states with an even or odd number of fermionic excitations), then the actual state in which the system may be found is either Grassmann-even or Grassmann-odd, which means that it is only composed of basis vectors of the same kind. Mixing of these two types of basis vectors isn't allowed; that's what the grading (or, equivalently, the superselection rule for bosons and fermions) means. All the coefficients of the wave function in front of the basis vectors are still ordinary complex commuting \(c\)-numbers.
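
To make the grading concrete, here is a tiny toy sketch of my own (nothing from the superqubit papers; all names and numbers are hypothetical): a four-dimensional Hilbert space split into a bosonic and a fermionic sector, a bosonic operator that is block-diagonal, a fermionic operator that is purely off-block-diagonal, and a state whose amplitudes are ordinary complex numbers living within a single sector.

```python
import numpy as np

# Toy Z2-graded Hilbert space with basis ordered as (even, even, odd, odd).
# The block structure, not the specific numbers, is the point of the illustration.
H_even = np.array([1, 0, 0, 0], dtype=complex)          # a Grassmann-even basis state
H_odd  = np.array([0, 0, 1, 0], dtype=complex)          # a Grassmann-odd basis state

bosonic = np.block([[np.eye(2), np.zeros((2, 2))],      # maps even -> even, odd -> odd
                    [np.zeros((2, 2)), 2 * np.eye(2)]])

fermionic = np.block([[np.zeros((2, 2)), np.eye(2)],    # maps even -> odd, odd -> even
                      [np.eye(2), np.zeros((2, 2))]])

print(bosonic @ H_even)    # stays in the even sector
print(fermionic @ H_even)  # lands in the odd sector

# A physical state is a complex combination of basis vectors within ONE sector;
# its amplitudes are ordinary c-numbers, never Grassmann numbers.
psi = (H_even + 1j * np.array([0, 1, 0, 0])) / np.sqrt(2)
print(np.vdot(psi, psi))   # norm 1, computed with ordinary complex arithmetic
```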

There's a simple reason why probability amplitudes can't be Grassmann numbers. To get physical commuting quantities out of Grassmann numbers, one always has to integrate. That's why Grassmann variables may appear as integration variables in Feynman's path integral; but that's also why they have to be set to zero if we're doing classical physics. There aren't any particular nonzero values of Grassmann numbers. On the other hand, probability amplitudes don't have to be integrated; their absolute values are simply squared to obtain the probabilities (or their densities such as differential cross sections).
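
To see why one "always has to integrate", here is a minimal sketch of the Grassmann algebra on two generators (my own toy code, not anything from the papers): elements are stored as coefficients on the basis \(\{1,\theta_1,\theta_2,\theta_1\theta_2\}\), and the Berezin integral just reads off the top coefficient – there is simply no "value" to extract from an unintegrated \(\theta\).

```python
from dataclasses import dataclass

@dataclass
class Grassmann:
    """Element c0*1 + c1*theta1 + c2*theta2 + c12*theta1*theta2 of the algebra."""
    c0: complex = 0
    c1: complex = 0
    c2: complex = 0
    c12: complex = 0

    def __add__(self, other):
        return Grassmann(self.c0 + other.c0, self.c1 + other.c1,
                         self.c2 + other.c2, self.c12 + other.c12)

    def __mul__(self, other):
        # theta_i * theta_i = 0 and theta2*theta1 = -theta1*theta2 are built in.
        return Grassmann(
            self.c0 * other.c0,
            self.c0 * other.c1 + self.c1 * other.c0,
            self.c0 * other.c2 + self.c2 * other.c0,
            self.c0 * other.c12 + self.c12 * other.c0
            + self.c1 * other.c2 - self.c2 * other.c1,
        )

def berezin(x):
    # Berezin integration: only the "top" component theta1*theta2 yields a number.
    return x.c12

theta1, theta2 = Grassmann(c1=1), Grassmann(c2=1)

print(theta1 * theta1)                    # zero: theta1^2 = 0
print(theta1 * theta2 + theta2 * theta1)  # zero: the generators anticommute
print(berezin(theta1 * theta2))           # 1: an ordinary number appears only after integration
print(berezin(theta1))                    # 0: a lone, unintegrated Grassmann variable carries no number
```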

So if their construction is consistent at all, it's just a mathematical analogy of superspaces at a different level – amplitudes themselves are considered "superfields" even though in genuine quantum physics, amplitudes are always complex numbers. That's why the inequalities can't be considered analogous to Bell-like inequalities and can't be applied to real physics. In particular, once again, Tsirelson's bound can't be violated by theories just because they're supersymmetric (in the conventional sense, just like the MSSM or type IIB string theory) because it may be derived for any quantum theory, whether it is supersymmetric or not, and supersymmetric theories are just a submanifold of more general theories for which the inequality holds.
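
As a sanity check of the bound that any quantum theory – supersymmetric or not – has to respect, the following short numerical sketch (mine, using the standard textbook CHSH strategy) reproduces Tsirelson's value \(2\sqrt{2}\) and the winning probability \(\cos^2(\pi/8)\approx 0.85\).

```python
import numpy as np

# A minimal numerical sketch (not from the papers): the standard optimal quantum
# strategy for the CHSH game on the state (|00> + |11>)/sqrt(2). It reproduces
# Tsirelson's bound <CHSH> = 2*sqrt(2), i.e. winning probability cos^2(pi/8).

Z = np.array([[1.0, 0.0], [0.0, -1.0]])
X = np.array([[0.0, 1.0], [1.0, 0.0]])

def obs(angle):
    """Spin observable cos(a)*Z + sin(a)*X with eigenvalues +1 and -1."""
    return np.cos(angle) * Z + np.sin(angle) * X

psi = np.array([1.0, 0.0, 0.0, 1.0]) / np.sqrt(2)   # (|00> + |11>)/sqrt(2)

def E(a, b):
    """Correlator <psi| A(a) (tensor) B(b) |psi>."""
    return psi @ np.kron(obs(a), obs(b)) @ psi

A0, A1 = 0.0, np.pi / 2          # Alice's measurement angles
B0, B1 = np.pi / 4, -np.pi / 4   # Bob's measurement angles

chsh = E(A0, B0) + E(A0, B1) + E(A1, B0) - E(A1, B1)
print(chsh, 2 * np.sqrt(2))                    # both ~ 2.828
print(0.5 + chsh / 8, np.cos(np.pi / 8) ** 2)  # winning probability ~ 0.8536
```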

I would point out that it wouldn't be the first time that Michael Duff and collaborators have given wrong interpretations to various objects related to quantum computation. Some formulae for the entropy of black holes mathematically resemble formulae for entangled qubits etc. But the interpretation is completely different. In particular, the actual information carried by a black hole is \(A/4G\) nats, i.e. black holes roughly parameterize an \(\exp(A/4G)\)-dimensional space of microstates. That's very different (by one exponentiation) from what is needed for the quantum-information interpretation of these formulae, in which the charges themselves play the role of the number of microstates.
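
To get a feeling for that "one exponentiation", here is a rough back-of-the-envelope computation (the standard Bekenstein–Hawking formula, my own numerical example, not anything from Duff's papers) for a solar-mass black hole: the entropy \(A/4G\) is of order \(10^{77}\) nats, while the dimension of the space of microstates is the exponential of that huge number.

```python
import math

# Bekenstein-Hawking entropy S = A/4G = 4*pi*G*M^2/(hbar*c) in nats,
# evaluated for one solar mass as a rough illustration of the scales involved.
G, hbar, c = 6.674e-11, 1.055e-34, 2.998e8   # SI units
M_sun = 1.989e30                             # kg

S_nats = 4 * math.pi * G * M_sun**2 / (hbar * c)
print(f"information carried:        ~ {S_nats:.1e} nats")          # ~ 1e77
print(f"equivalently:               ~ {S_nats / math.log(2):.1e} qubits")
print(f"microstate-space dimension: ~ exp({S_nats:.1e})")          # one more exponentiation
```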

So I think that at least Michael Duff has been sloppy when it came to the interpretation of these objects, which was the source of his misleading comments about the "black hole entropy formulae emulating tasks in quantum computation". There may be mathematical similarities – I am particularly referring to the Cayley hyperdeterminant appearing both in quantum computing and black hole entropy formulae – but the black holes aren't really models of those quantum algorithms, because their actual Hilbert space dimension is the exponential of what it should be for that interpretation and they're manipulating pretty much all the qubits at the same moment. The objects in the hyperdeterminant have completely different interpretations on the string theory and quantum computing side; there isn't any physical duality here, either.

Let me return to the superqubits. Normal quantum mechanics realizes all "bosonic Lie group" transformations you may think of – such as rotations or translations – as elements of \(U(N)\) acting on the Hilbert space; that's true for all quantum field theories we know and string theory, too. You could think that there could be a "natural" extension where \(U(N)\) is replaced by \(U(M|N)\), a supergroup. Similarly the normalization condition for the wave function "could" include squared fermionic amplitudes.

The first "novel" idea is partly possible: generators of supersymmetry indeed correspond to operators that map bosonic states to fermionic ones and vice versa. However, to get an "actual finite supersymmetry transformation", you need to consider objects of the type
\[
\exp(\theta_\alpha Q^\alpha-\text{h.c.})
\]
in which the generator \(Q^\alpha\) is multiplied by a Grassmann-odd coefficient \(\theta_\alpha\). Again, there are no "particular nonzero values" of Grassmann-odd numbers so you can't talk about any "particular transformation" of this form. In particular, the evolution in time can never map bosonic states to fermionic ones and vice versa. You must view the generators of supersymmetry as fermionic operators that map particular bosonic states to particular fermionic states but the exponential above can't be constructed because there are no "actual values" of \(\theta_\alpha\) that you could insert. The finite exponential is just an abstract construct designed to look analogous to similar exponentials of bosonic operators but there's still a difference, namely that the bosonic numbers take values in a particular set while the fermionic ones don't.
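
Just to spell the point out (a standard expansion, nothing specific to the papers), the exponential is only a formal power series in the nilpotent \(\theta\)'s,
\[
\exp(\theta_\alpha Q^\alpha-\text{h.c.}) = 1 + (\theta_\alpha Q^\alpha-\text{h.c.}) + \tfrac{1}{2}(\theta_\alpha Q^\alpha-\text{h.c.})^2 + \dots,
\]
a series that terminates after finitely many terms because every higher term contains a longer product of the anticommuting \(\theta\)'s, and that would only become an ordinary operator once commuting values were substituted for the \(\theta\)'s – which, as argued above, can't be done.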

Equivalently, we may observe that quantum mechanics makes it possible to learn the initial state of a physical system – everything about it – up to an overall phase (or normalization). This couldn't be the case if some amplitudes were allowed to be fermionic, because Grassmann numbers can't really be "measured" by any apparatus, nor can their values be "pronounced" (except for zero).

Effectively, these superqubit folks conclude that they may violate universal inequalities of quantum mechanics because they allow certain objects – probability amplitudes – to take values of an entirely different form than what is allowed in quantum mechanics. In practice, it has the same effect as if you assumed that \(|c|^2\) may be negative for a complex number (it can't), which is the ultimate reason why they think that the winning probability may be higher than what is allowed by quantum mechanics.

This is actually more than just an analogy. The actual role of the Grassmann-odd amplitudes is that there is an extra term in \(\langle\psi|\psi\rangle\) which is equal to \(i\theta_1\theta_2\) or \(\theta^*\theta\), a product of two Grassmann variables, replacing \(|c_i|^2\). But given one possible arrangement of this sort, the other arrangements are obtained by rescaling \(\theta_1,\theta_2\) by the inverse factors \(k\) and \(1/k\), respectively, so this extra term in the overall probability is really equivalent to the bosonic \(xy\), which isn't positive definite; if subsystems were described by superqubits, they could be created with negative probabilities. The feeling that \(\theta^*\theta\) could be positive definite is an illusion because these two Grassmann variables must actually be independent; there's no way to impose the usual reality condition on one Grassmann variable.
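
In formulas (my own spelling-out of the argument above): writing the complex odd parameter as \(\theta=\theta_1+i\theta_2\) with two independent real Grassmann generators,
\[
\theta^*\theta = (\theta_1-i\theta_2)(\theta_1+i\theta_2) = 2i\,\theta_1\theta_2,
\]
a product of two independent odd variables rather than the square of anything; under the rescaling \(\theta_1\to k\theta_1\), \(\theta_2\to k^{-1}\theta_2\) it behaves just like the bosonic product \(xy\), which can have either sign, and not like the manifestly non-negative \(|c|^2=x^2+y^2\).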



There are no superqubits but there are superorganisms. ;-) A community of ants (a superorganism), together with several human collaborators, built a sophisticated system of routes and tunnels out of concrete in the soil. It's the equivalent of the Great Wall of China but it's much more structured and fractal than the Chinese counterpart. Human society does similar things, although the algorithms are less "hardwired" in the heads of the humans. However, it's essential for human progress that many people work "outside the system", as individuals.
