In 2010 Scott Aaronson and Alex Arkhipov posted a 95-page preprint on the quant-ph arXiv arguing that linear optical interferometers cannot be efficiently simulated on classical computers. The argument is couched in quantum computational complexity theory, which can be daunting to those not skilled in that art.


Independently, in 2011, our group was led to a similar conclusion in our attempts to predict the output of a specific linear optical interferometer we call the quantum pachinko machine. In light of the Aaronson and Arkhipov result, we have gone back to investigate the complexity of this specific device, and we have a new paper that explicitly constructs the Hilbert space of the machine and shows that its dimension grows exponentially with all the resources needed to build the device.

Our argument is only about 4 pages long and follows the logic of Feynman: if the Hilbert space is exponentially large, then it is unlikely that a classical computer can simulate such a machine. We don't invoke any elements of quantum complexity theory in our argument, which may come as a relief to some physicists. I present the abstract of our new paper here:

"Aaronson and Arkhipov recently used computational complexity theory to argue that classical computers very likely cannot efficiently simulate linear, multimode, quantum-optical interferometers with arbitrary Fock-state inputs [S. Aaronson and A. Arkhipov, arXiv:1011.3245]. Here we present an elementary argument that utilizes only techniques from quantum optics. We explicitly construct the Hilbert space for such an interferometer and show that its dimension scales exponentially with all the physical resources. Finally, we also show that the Schrödinger and Heisenberg pictures of quantum theory, while mathematically equivalent, are not in general computationally equivalent."

Thus such linear optical interferometers, particularly with arbitrary Fock-state inputs, are likely not efficiently simulatable on a classical computer. The blow up in the Hilbert space is directly connected to the bosonic Fock states in the input. This comes as a bit of a surprise to us in the quantum optics community where the conventional wisdom had been that any linear optical device is always efficiently simulatable regardless of the input. To quote one of our co-authors, Hwang Lee, "This is shocking!"

Hence such linear optical interferometers, with their exponentially large Hilbert space, meet Feynman's necessary criterion for a universal quantum computer. Whether it is a universal quantum computer or not is an open question.
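To make this scaling concrete: the dimension of the Hilbert space for n indistinguishable photons in m modes is the standard stars-and-bars count C(m+n-1, n). A few lines of Python (a quick illustration of this standard count, not code from the paper) show the blow-up:

```python
from math import comb

def boson_dim(n_photons, n_modes):
    # Number of ways to distribute n indistinguishable photons among m modes
    # (stars and bars): C(m + n - 1, n).
    return comb(n_modes + n_photons - 1, n_photons)

# Doubling the photon number (with the mode count scaling alongside)
# roughly squares the Hilbert-space dimension -- exponential growth.
for n in (2, 4, 8, 16):
    m = 2 * n
    print(n, m, boson_dim(n, m))
```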

However, large-number photonic Fock states are highly susceptible to loss and dephasing, much more so than Gaussian states, and so whatever this thing is, it is not a *fault-tolerant* quantum computing machine. Error correction of some sort will need to be deployed to show that this is useful. Nevertheless, as Arkhipov and Aaronson note, linear optics with bosonic inputs is an interesting playground in quantum information processing. We heartily agree with their assessment in our new paper.

Hi Jon,

You seem to base your criterion for what's classically simulatable entirely on the size of a system's Hilbert space ("If the Hilbert space is exponentially large then it is unlikely that a classical computer can simulate such a machine"). This is not a very strong argument. In fact it's wrong. I could apply this reasoning to argue that a Hadamard transform applied to n separable |0> qubits yields a superposition with 2^n elements. Or I could argue that an arbitrarily large quantum computer on n qubits comprising only Clifford operations yields an exponentially large Hilbert space. Yet we know that such systems are easy to classically simulate. Feynman may be right that this is a necessary criterion for quantum hardness, but it is most definitely not a sufficient one.

"Hence such linear optical interferometers, with their exponentially large Hilbert space, meet Feynman's sufficient criterion for a universal quantum computer" - wrong, it's not a sufficient condition, it's a necessary one! You've proven a necessary condition, but not the sufficient one. This same necessary condition can be proven for a vast array of quantum systems that are trivial to classically simulate. Think about tensor network states - exponentially complex, easy to classically simulate.

The reason why bosonic Fock states are hard to simulate is not just the exponential number of terms in the superposition, but the relation between the output amplitudes and matrix permanents, which are classically hard to calculate. In the case of fermions the amplitudes are related to matrix determinants, which are easy to classically calculate.
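The permanent-versus-determinant asymmetry in Peter's last point can be seen in code: a brute-force permanent sums over all n! permutations, while the determinant of the same matrix falls to O(n^3) Gaussian elimination. A minimal sketch (illustrative only; practical boson-sampling analyses use faster permanent algorithms such as Ryser's, which are still exponential):

```python
from itertools import permutations

def permanent(A):
    # Naive expansion: sum over all n! permutations -- exponential time.
    n = len(A)
    total = 0
    for perm in permutations(range(n)):
        prod = 1
        for i, j in enumerate(perm):
            prod *= A[i][j]
        total += prod
    return total

def determinant(A):
    # Gaussian elimination with partial pivoting -- O(n^3) time.
    n = len(A)
    M = [row[:] for row in A]
    det = 1.0
    for col in range(n):
        pivot = max(range(col, n), key=lambda r: abs(M[r][col]))
        if abs(M[pivot][col]) < 1e-12:
            return 0.0
        if pivot != col:
            M[col], M[pivot] = M[pivot], M[col]
            det = -det
        det *= M[col][col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n):
                M[r][c] -= f * M[col][c]
    return det

A = [[1.0, 2.0], [3.0, 4.0]]
print(permanent(A))    # 1*4 + 2*3 = 10
print(determinant(A))  # 1*4 - 2*3 = -2
```

The sign in the determinant's permutation expansion is exactly what makes row reduction possible; the permanent has no such cancellation structure, and no polynomial-time algorithm for it is known, consistent with it being #P-hard.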

Your arguments about the easiness of simulating coherent states are spot on.

I'd appreciate your thoughts.

Cheers,

Peter.

Peter:

It is a typo.

;-)

We say "necessary condition" in the paper. I corrected the blog. My first blog post ever but I had a Homer Simpson moment -- DOH!

We are aware that this is not a very strong argument for people in quantum computing, but it does seem to be a convincing argument for those in quantum optics, which is why we wrote the paper. I understand that the business about the interferometer computing the permanent is the complexity-theoretic reasoning as to why this is hard, but I wanted to get into the guts of the thing and see just what was going on from a physics point of view. Interestingly, I knew that the permanent was computationally hard long ago, and as early as 2000 Colin Williams and I conjectured that a bosonic interferometer would not be efficiently simulatable for this reason, but we never wrote it up. As I mention in the blog, when I show this result to complexity theorists they are not that impressed, but the quantum opticians are shocked. Also, we are not trying to compute the permanent but just to predict the output of the interferometer from a physics point of view. The fact that we can't do that is, I think, interesting to the quantum optics community but perhaps less so to the quantum computing community. Finally, there is this bit about the Schrödinger and Heisenberg pictures having different complexity scalings that I find very interesting.

I remember when I first learned that n qubits with only Clifford operations were efficiently simulatable I was perplexed, and my take on it was that although the n qubits span a 2^n-dimensional Hilbert space, the Clifford gates only allow you to access a polynomial-sized subspace of it, but perhaps that is even wrong. The last thing I want is to get into a fight in the quantum blogosphere with Scott Aaronson. In the end, very few of my physicist friends have even read Scott's paper, much less understood it, and so I think our result is cute in that it provides some physical insight beyond stating "the device is computing the permanent of a matrix and that is hard."

Best wishes,

Jon


Hi Jon,

Thanks for your thoughts.

A minor point of clarification - boson-sampling does *not* let you compute the permanents of matrices. Each amplitude in the output configuration encodes a different matrix permanent. To calculate one particular matrix permanent you would have to measure one amplitude to high precision. But there are an exponential number of them, which would require an exponential number of measurements, which obviously isn't efficient. So boson-sampling doesn't let us calculate matrix permanents; rather, it stochastically samples across many different matrix permanents. If it were true that boson-sampling let us calculate permanents, this would imply #P<BQP, which is an even stronger statement than "quantum computers can solve NP-complete problems". Not likely!

Cheers,

Peter.

I thought my ex-supervisor and my current supervisor were about to have it out!

Hi Peter:

Thanks for the clarification. Perhaps that is why Colin and I never wrote anything up! We never could quite show it was possible to compute the permanent.

I am the first to clarify that I'm not in any way an expert in quantum complexity theory. Hence when I see statements like "#P<BQP" I am thrown into a coma where I think I'm back in a NASA program manager meeting where the number of acronyms per square meter in a PowerPoint presentation exceeds even that in a talk on quantum complexity.

;-)

Nevertheless I know enough to get into trouble, which is clear from this discussion, but I have a thick skin. I think I learn through constructive dialogue, and I have never been shy about making a fool of myself in public. I did understand enough of Scott and Alex's paper to realize their claim was that there was something very special about linear optical interferometers with bosonic Fock-state inputs.

We had also arrived at this conclusion from the point of view that we could not predict the output of such a device even for small photon numbers and small number of modes. We thought it might be constructive, particularly for those of us in quantum optics not skilled in the arcane arts of the complexity zoo, to provide some sort of intuitive physical argument in support of Scott and Alex's conclusion.

To many of us in quantum optics, the fact that the Hilbert space of this thing blows up exponentially fast as a function of the number of modes and the photon number is shocking, and it provides us with that intuitive, if not rigorous, understanding as to what the heck is going on.

Keith, I'm sure this discussion is quite civil and will be settled at the pub when I arrive in July. If not through lengthy discussions and many schooners of beer, then by an arm-wrestling competition between Peter and me that I'm sure you will be happy to judge impartially.

Cheers,

Jon


Keith, if there's an arm-wrestle between Jon and me, you know who has to be declared winner, right? right!!!?

...that's right, Jon does.

ReplyDeleteHi Jonathan. For whatever it's worth, I completely agree with Peter. Exponentiality of the Hilbert space is NOT evidence for the hardness of classical simulation; it's only a very weak sufficient condition. And if an argument based on exponentiality of the Hilbert space were "convincing" to quantum optics people, then the quantum optics people would be *mistaken* to be convinced -- not a cultural difference, just a provable fact.

The easiest way to see this is that the Hilbert space is also exponential for non-interacting fermions -- yet those can be simulated in classical polynomial time. So then, what is it that makes bosons hard to simulate and fermions easy to simulate, while both have exponentially-large Hilbert spaces? I don't see any way to answer that question without talking about the fact that the permanent is #P-complete while the determinant is in P. I don't think my and Alex's argument is actually that "daunting," but at any rate, our use of complexity theory was not a dispensable frill: it's the only way I know of to get evidence that the conclusion is actually true.

Sorry, sufficient condition --> necessary condition

Hi Scott:

We expected that the quantum complexity community would not really be enthralled with this paper. That's why we did not write it for that audience. I liken trying to talk with complexity theorists to trying to speak Italian when I'm in Italy. I don't know much Italian, but in Italy people generally expect you to speak Italian, and so I bumble along and make lots of mistakes. Usually the Italians are thrilled that I'm making an attempt and correct me as I go along, and after my trip I know a bit more Italian than I did before!

Cheers,

Jon

PS: I forgot to mention that we do discuss the case of fermions briefly in our paper. The difference is that, since only one fermion can occupy a mode at a time, Fock states of the form |N> can only exist if N=1, and we show that, at least for non-interacting fermions in a linear interferometer, the output can be efficiently calculated and there is no blow-up in the Hilbert space, at least as per Eq. (4) taking N=1. Our goal was not to establish truth (you and Alex have done that) but rather to provide some physical insight as to what was going on. That I think we have done.

Peter:

Perhaps we should choose a more impartial referee for the arm wrestling.

Jon

Jonathan:

Sorry, I can't resist expanding a little on your fun analogy. Imagine me trying to convince an audience of non-Italians that I visited Italy, and could even teach THEM some Italian, by walking around shouting:

"Ravioli! Rigatoni! Marinara! Bonjour!"

in an exaggerated Italian accent. (Yes, the "Bonjour" was intentional.)

How should an Italian---or for that matter, the non-Italians in my audience---react? How about: "We're absolutely thrilled to see your genuine interest in Italy, and in Italian cuisine and culture! So please, go back to Italy for another visit and study some more!"

Also, I looked again at your paper, and I don't think your argument about fermions is correct. If you have n identical fermions in m modes, the number of possible states is (m choose n), which grows exponentially as m and n get large, exactly as it does in the bosonic case. The fact that you can only have 1 fermion per mode is irrelevant here (it decreases the exponential, but only slightly, from (m+n-1 choose n) down to "merely" (m choose n)). Again, to see the difference between n bosons and n fermions, I think you have to talk about the complexity-theoretic difference between the n*n permanent and the n*n determinant.
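Scott's counting claim is easy to check numerically. This short sketch (illustrative only, using the standard combinatorial formulas) compares the fermionic count (m choose n) with the bosonic count (m+n-1 choose n):

```python
from math import comb

def fermion_dim(n, m):
    # At most one fermion per mode: choose which n of the m modes are occupied.
    return comb(m, n)

def boson_dim(n, m):
    # Stars and bars: n indistinguishable bosons in m modes.
    return comb(m + n - 1, n)

# Both counts blow up exponentially as n and m grow together, so
# Hilbert-space size alone cannot separate the two cases.
for n in (5, 10, 20):
    m = 2 * n
    print(n, fermion_dim(n, m), boson_dim(n, m))
```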

Hi Scott:

That is remarkably close to what I do when wandering around in Rome, except that I replace "Bonjour" with "Vino", and the end result is that I'm always well fed and also slightly inebriated, and after just a few glasses of pinot grigio I even begin to think I understand some quantum complexity theory vis-à-vis Dowling's Many Beer Interpretation of Quantum Mechanics.

As for the identical fermions, I like your statement "...you have to talk about the complexity-theoretic difference between the n*n permanent and the n*n determinant." It reminds me of an anecdote (as told to me by John Wheeler). Bohr was lecturing on the foundations of quantum mechanics and declared at one point, "You must always express the measurement apparatus in classical terms!" In response to this Wigner, seated in the front row, leaped to his feet and in his high-pitched voice called out, "What will happen to me if I don't!?" With that aside I will think more about identical fermions vs. identical bosons.

Cheers,

Jon

On a related note: "Horoscope for April 17, 2013: ARIES -- The more progressive you are, especially when dealing with intangibles, the luckier you'll get. The same cannot be said if you bog yourself down in traditional methods."

LOL about the "Many Beer Interpretation" and wandering around Rome saying "Vino"!

In this case, "what will happen to you if you don't" is pretty simple: you'll make absurd predictions, namely that efficient classical algorithms are "unlikely to exist" even in situations where they plainly do exist. Indeed, Alex Arkhipov pointed out to me that there's a simpler example than identical fermions, which makes the point even more strongly. Namely, n CLASSICAL particles distributed among m bins! Those will also have exponentially many possible configurations. But obviously, that doesn't prevent them from being simulated in classical polynomial time.

Hi Scott:

I have made a career of making absurd predictions. The fermion case is more interesting than I thought! To quote Einstein, "I dink I vill have a little dink."

Jonathan

Hi Scott:

You are right about the fermions. Our original goal in this paper was to see why bosonic Fock states are different from other bosonic optical states such as coherent or squeezed states, and the Hilbert dimension argument provides that difference. We added the fermion comment as an afterthought, but not correctly, as you point out, and so yes, a simple counting argument gives the dimension of the fermionic system Hilbert space as Binomial[2L,N] (not Binomial[N,2L]), where N=n and 2L=m in your notation, subject to the constraint N≤2L to ensure there is never more than one fermion in a mode.

The 'computationally complex' regime in our lingo is when n=N=m/2=L, and then the Hilbert-space dimension scales as 2^(2N), just as in the bosonic case. With other bosonic states such as squeezed and coherent out of the way, we can begin to think about what is so special about bosonic Fock states vs. fermionic Fock states.
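The 2^(2N) scaling in this regime is just the growth of the central binomial coefficient, Binomial[2N, N] ~ 4^N / sqrt(pi N) by Stirling's approximation. A quick numerical check (illustrative only):

```python
from math import comb, log2

# In the regime n = N, m = 2L = 2N the fermionic Hilbert-space
# dimension is the central binomial coefficient Binomial[2N, N].
# Its growth rate log2(dim)/N approaches 2 from below,
# i.e. dim ~ 2^(2N) up to a sub-exponential factor.
for N in (5, 10, 20, 40):
    dim = comb(2 * N, N)
    print(N, dim, log2(dim) / N)
```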

One important additional constraint, that brings me back to your description of these devices, is that the state of the interferometer must at all times be symmetric under particle interchange for bosons and antisymmetric for fermions.

Almost any input state for bosons you might think of is symmetric. For example, in the 6x6 system in my Fig. (1), the input |111111>, one boson in each input, is perfectly fine. This is not an acceptable state at all for fermions, and so it is not a valid input. Such a state cannot exist, as it is not antisymmetric under two-particle interchange!

To find the set of all possible antisymmetric states, to characterize and predict the outcome of the fermionic interferometer, I use the Slater determinant method. Hence I can compute the output for a given input in spite of the fact that the Hilbert space is exponentially large, since the determinant is computationally easy to carry out.

For bosons I would use the "Slater permanent" to find the acceptable symmetric states for the state of the interferometer. Even for a fixed input I cannot in general compute the output because the permanent is computationally hard.

Perhaps this is what you have been saying all along but I had to put it into my own language to really understand it.

Cheers,

Jon

I don't see how |111111> is not a valid input state for fermions? This just corresponds to putting just a single fermion at each mode. Surely that's perfectly legit? I agree that |222222> is not valid because of the anticommutation relation, but |111111>?

Specifically, the anticommutation relation for fermions is {a_i^\dag, a_j^\dag} = 0. Thus (a_i^\dag)^2 = 0, so we can't have |2>. But this doesn't rule out |11>.

Hi Peter:

Are we having fun yet?

;-)

I'm very much enjoying this discussion and I'm learning a lot.

Let us take just the input |1>|1> at the upper-most beam splitter in Fig. (1), where each fermion has the same spin state. When we write a Fock state like this we imply it is in a plane-wave configuration, and so at the beam splitter, in order to interfere, the two fermions will be indistinguishable and have unit overlap. At that point the symmetrization postulate demands that the two-particle wave function be antisymmetric under particle-label interchange (the origin of fermionic statistics), but |1>|1> is not if both are, say, spin up, and so the state cannot be constructed experimentally in the first place.

You would have to prepare them in an entangled spin singlet if you insist on overlap at the beam splitter to give rise to interference. You could send them in in different modes, time-binning them, and then it would be fine, as they would then be distinguishable by time bin, but then you'd get no interference at the beam splitter and no interferometer.

A good reference would be:

R. Loudon, "Fermion and boson beam-splitter statistics," Physical Review A 58, 4904-4909 (1998). DOI: 10.1103/PhysRevA.58.4904

Generalizing upon this, the multi-fermion wave function must be antisymmetrized at the input. The procedure for doing this is to construct the Slater determinant from the individual wave functions of each particle, which is then how we would predict the consequent interferometer output. Since determinants are computationally easy to evaluate (by row reduction), we can do this efficiently.

Now for bosons we luck out, since the natural input state |1>|1>, with both bosons in the same spin state, is symmetric under particle interchange and satisfies the symmetrization postulate automatically. However, the most general bosonic input would be a multi-particle state that is completely symmetric under pairwise particle interchange.

This wave function can be computed by taking the Slater permanent of the individual particle states. However, the permanent is not computable by row reduction, and one must resort to something like the Laplace expansion, which requires an exponentially large number of steps in the size of the matrix.
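A Laplace-style expansion of the permanent can be written in a few lines; unlike the determinant's cofactor expansion there are no minus signs, so nothing cancels and no row-reduction shortcut exists. This sketch (an illustration of the exponential-time expansion, not code from the paper) expands along the first row:

```python
def permanent_laplace(A):
    # Expansion along the first row: the permanent analogue of the
    # cofactor expansion, but with all signs positive. The recursion
    # visits an exponential number of submatrices.
    n = len(A)
    if n == 1:
        return A[0][0]
    total = 0
    for j in range(n):
        # Minor: drop row 0 and column j.
        minor = [row[:j] + row[j + 1:] for row in A[1:]]
        total += A[0][j] * permanent_laplace(minor)
    return total

print(permanent_laplace([[1, 2], [3, 4]]))  # 1*4 + 2*3 = 10
```

Ryser's inclusion-exclusion formula brings the cost down to O(2^n n), but that is still exponential in the matrix size: this is the computational wall behind the "Slater permanent."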

Cheers,

Jon