### Rasetti: Quantum Computers based on TQFT

#### Posted by Urs Schreiber

Today’s sessions in Vietri are about quantum computation. Most of them go into more and different detail than I currently care about, but one had an interesting provocative theme related to high energy physics.

M. Rasetti is a theorist from the University of Torino. He says he is working on trying to figure out whether quantum computation (QC) is fundamentally stronger than classical Turing-like computation: in other words, whether QC can solve problems that are not in P in polynomial time.

I know that there is a meme floating around some brains which says that if you had a quantum system governed by a Hamiltonian whose discrete eigenvalues are a sufficiently-hard-to-compute function of the integers, then by measuring these eigenvalues in a suitably prepared system you could essentially compute this function and hence solve not-P problems in polynomial time.

This argument I always found rather less than convincing. It suffices to note that by precisely the same argument you could claim that a classical analog computer can compute NP problems easily: just partition classical phase space into a denumerable number of subsets and pick a system whose energy is a non-computable function of the ordinal associated to each such subset. Up to some smoothness issues this is precisely analogous to the idea above, and clearly not a reasonable way to get a stronger-than-classical computer.

What Rasetti is talking about sounds much more sophisticated – though in the end it seems to reduce to more or less the above argument. Or does it?

So the idea is that there may be more types of quantum computers than currently considered in the standard literature. In particular, Rasetti's idea is that by using systems that have to be described by quantum *field* theory one might be able to (in principle) construct new and more powerful types of quantum computers.

This idea apparently goes back to Michael Freedman, who first proposed that non-abelian topological quantum field theories might exhibit the features necessary to support a model of computation capable of solving not-P problems in polynomial time.

Apparently Freedman, Kitaev and Wang in particular proposed that Chern-Simons theory is a promising candidate.

I have not yet managed to talk to Rasetti and ask him some questions about his work, but if I understood correctly what he said in the talk the idea is:

The computation of non-abelian holonomies, as well as that of Jones polynomials, is not in P. Since the observables of CS theory are precisely these quantities, all we have to do is set up a system which is governed by CS theory, somehow prepare it to be in a given knot state, and then observe its observables. Since these will be not-P functions of these knots, we effectively have an analog quantum computer which computes not-P problems easily.

Does that make sense? Isn't it precisely the same idea as the one mentioned at the very beginning: given any system whose observables are a not-P function of its states, it superficially looks like a device that computes not-P problems in polynomial time.

I have the feeling that something is wrong here.

I also feel that there should be a high-brow theoretical answer for what is wrong, or for why this is not wrong, and am surprised that among QC theorists there apparently is no clarity about this.

So for me the conclusion of Rasetti's talk is currently just that if you are willing to be interested in systems whose observables are not-P functions of their states, then some quantum field theories in general, and Chern-Simons theory in particular, suggest themselves as laboratories, since they do enjoy this property naturally.

**Update:** After rereading what I wrote here, it now seems to me that the issue is related to the general question of the extent to which we want to regard analog computers as computers when it comes to complexity issues. I am not a computation theorist, so what I am saying here will sound crude to those who are, but let me say it anyway:

Clearly, when talking about computational complexity one has to remove analog computers from the picture somehow. For instance, protein folding is a famously hard problem. It remains hard even though I can figure out how a protein folds easily by just synthesizing it and seeing how it folds. That's an analog computation, and I bet it scales linearly with the size of the protein. You might rather want to call it an experiment, though.

And that’s the point, I think. It’s an experiment rather than a computation. And the same is true for the Chern-Simons computer proposed by Rasetti et al. They propose to make experiments in field theory and regard the measurement of the outcome as a calculation.

For practical purposes that may be fine. If I really want to know the value of some Jones polynomial, and I really find it easier to come up with a system governed by CS theory, prepare it suitably, and measure its observables somehow with sufficient accuracy than to compute the polynomial on a digital computer up to that accuracy, then that's fine, I guess.

But

a) this has little to do with quantum computation and is all about analog computers, and

b) hence it does not show that quantum computers are inherently more powerful than classical ones.

I assume the problem is related to the fact that we really want to study *universal* computers, not just special-purpose ones. I believe the problem is to figure out if a *universal* quantum computer is more powerful than the universal Turing machine.

## Re: Rasetti: Quantum Computers based on TQFT

Maybe I can help clarify a bit here.

It is true, of course, that one can define an observable for a particular TQFT which is the Jones polynomial. Calculating the Jones polynomial exactly is very hard (it is #P-hard, where #P is a counting complexity class). So Freedman's original idea was that TQFTs might be powerful enough to help us calculate the Jones polynomial.

But this logic (that there is an observable whose expectation value is classically hard to calculate, and therefore we can just use the quantum system to get this expectation value) doesn't work. Why? Because what you get from running the quantum experiment is not the expectation value but a particular outcome, which, when averaged over many runs, yields the expectation value. In computer science jargon you would say that you have an algorithm for "estimating" the value of the Jones polynomial. Now this in and of itself is also interesting: how does a quantum system's estimate compare with classical algorithms for estimating the Jones polynomial? As far as I know, no one has shown a quantum-based estimate for the Jones polynomial which is better than a classical algorithm.
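The gap between an expectation value and the individual outcomes you actually measure can be illustrated with a toy simulation (a hypothetical sketch, not tied to any actual TQFT or experiment): each run yields a ±1 outcome, and only the average over many runs approaches the expectation value, with statistical error shrinking like 1/√N.

```python
import math
import random

def sample_outcome(expectation, rng):
    """Simulate one run of the experiment: a +1/-1 valued measurement
    whose average over many runs equals `expectation` (hypothetical toy model)."""
    p_plus = (1 + expectation) / 2  # probability of measuring +1
    return 1 if rng.random() < p_plus else -1

def estimate_expectation(expectation, n_runs, rng):
    """Average n_runs individual outcomes to estimate the expectation value."""
    return sum(sample_outcome(expectation, rng) for _ in range(n_runs)) / n_runs

if __name__ == "__main__":
    rng = random.Random(0)
    true_value = 0.3  # the (hypothetical) expectation value we are after
    for n in (100, 10_000, 1_000_000):
        est = estimate_expectation(true_value, n, rng)
        # the statistical error shrinks like 1/sqrt(n): one extra digit
        # of precision costs a hundred times more runs
        print(n, round(est, 3), "error scale:", round(1 / math.sqrt(n), 3))
```

Since halving the additive error requires quadrupling the number of runs, such an experiment only ever yields an estimate of the observable, never its exact value.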

I would also say that among quantum computing complexity theorists there IS clarity about this issue. Also note that the current "faster than classical" quantum algorithms don't work in this manner (measuring the expectation value of an observable). Instead, they work more like a probabilistic classical computer. Note that there is a huge difference between an analog computer (whose power scales poorly with precision) and a probabilistic computer (whose power scales nicely with the precision of the probabilities). In fact, this simple difference is, I think, what causes many physicists to disregard quantum computers as "merely" analog machines. They're not; they're more similar to probabilistic machines, and computer scientists and reasonable people all agree that there is very little problem calling those true, realistic computers.
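The "nice scaling" of probabilistic computation can be sketched with a generic toy example (not specific to any quantum algorithm): a decision procedure that answers correctly with probability 2/3 can be repeated and majority-voted, and the failure probability then falls off exponentially in the number of repetitions, whereas an analog machine needs ever finer physical control for each extra bit of precision.

```python
import random

def noisy_decision(correct_answer, p_correct, rng):
    """One run of a bounded-error procedure: right with probability p_correct."""
    return correct_answer if rng.random() < p_correct else not correct_answer

def majority_vote(correct_answer, p_correct, repetitions, rng):
    """Repeat the bounded-error procedure and take a majority vote
    (use an odd number of repetitions to avoid ties)."""
    votes = [noisy_decision(correct_answer, p_correct, rng)
             for _ in range(repetitions)]
    return votes.count(True) > votes.count(False)

def failure_rate(repetitions, trials, rng):
    """Empirical probability that the majority vote gives the wrong answer."""
    wrong = sum(majority_vote(True, 2 / 3, repetitions, rng) is not True
                for _ in range(trials))
    return wrong / trials

if __name__ == "__main__":
    rng = random.Random(0)
    for reps in (1, 11, 51):
        # the failure probability drops exponentially in the number
        # of repetitions (a Chernoff-bound effect)
        print(reps, failure_rate(reps, 2000, rng))
```

This amplification by simple repetition is what makes bounded-error probabilistic (and quantum) machines robust in a way analog machines are not.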

Another thing to note is that for a large class of TQFTs Freedman, Kitaev, and Wang have shown that these TQFTs are equal in power to a traditional quantum computer (see, for example, quant-ph/0001071). For some TQFTs it is also true that they aren't even as powerful as a quantum computer. But so far I know of no candidates for TQFTs which are more powerful than the traditional quantum circuit model. The main reason one might hope that these models are more powerful is that TQFTs don't necessarily have a built-in locality structure. What Freedman, Kitaev, and Wang showed was that for a large class of models there is indeed a notion of locality hidden in the particular TQFTs they examine.