Results 1–10 of 65
Holographic Algorithms: From Art to Science
Electronic Colloquium on Computational Complexity Report, 2007
Abstract

Cited by 41 (16 self)
We develop the theory of holographic algorithms. We give characterizations of algebraic varieties of realizable symmetric generators and recognizers on the basis manifold, and a polynomial-time decision algorithm for the simultaneous realizability problem. Using the general machinery we are able to give unexpected holographic algorithms for some counting problems, modulo certain Mersenne-type integers. These counting problems are #P-complete without the moduli. Going beyond symmetric signatures, we define d-admissibility and d-realizability for general signatures, and give a characterization of 2-admissibility and some general constructions of admissible and realizable families.
The Computational Complexity of Linear Optics
In Proceedings of STOC, 2011
Abstract

Cited by 32 (8 self)
We give new evidence that quantum computers—moreover, rudimentary quantum computers built entirely out of linear-optical elements—cannot be efficiently simulated by classical computers. In particular, we define a model of computation in which identical photons are generated, sent through a linear-optical network, then nonadaptively measured to count the number of photons in each mode. This model is not known or believed to be universal for quantum computation, and indeed, we discuss the prospects for realizing the model using current technology. On the other hand, we prove that the model is able to solve sampling problems and search problems that are classically intractable under plausible assumptions. Our first result says that, if there exists a polynomial-time classical algorithm that samples from the same probability distribution as a linear-optical network, then P^#P = BPP^NP, and hence the polynomial hierarchy collapses to the third level. Unfortunately, this result assumes an extremely accurate simulation. Our main result suggests that even an approximate or noisy classical simulation would already imply a collapse of the polynomial hierarchy. For this, we need two unproven conjectures: the Permanent-of-Gaussians Conjecture, which says that it is #P-hard to approximate the permanent of a matrix A of independent N(0,1) Gaussian entries, with high probability over A; and the Permanent Anti-Concentration Conjecture, which says that |Per(A)| ≥ √(n!)/poly(n) with high probability over A. We present evidence for these conjectures, both of which seem interesting even apart from our application. For the 96-page full version, see www.scottaaronson.com/papers/optics.pdf
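The hardness evidence above turns on the matrix permanent. As a hedged illustration (the helper name `permanent` is ours, not the paper's): even the best-known style of exact algorithm, Ryser's formula, costs exponential time, consistent with the #P-hardness of the exact problem.

```python
import numpy as np

def permanent(A):
    """Exact permanent via Ryser's inclusion-exclusion formula.

    This naive version runs in O(2^n * n^2) time; the exponential cost even
    of the best exact methods is consistent with #P-hardness."""
    n = A.shape[0]
    total = 0.0
    for subset in range(1, 1 << n):
        # columns selected by the bitmask `subset`
        cols = [j for j in range(n) if (subset >> j) & 1]
        # product over rows of the row-sums restricted to those columns
        prod = np.prod(A[:, cols].sum(axis=1))
        total += (-1) ** (n - len(cols)) * prod
    return total
```

For example, `permanent(np.ones((4, 4)))` gives 4! = 24, since the permanent of the all-ones matrix counts all permutations.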
Multilinear Formulas and Skepticism of Quantum Computing
In Proc. ACM STOC, 2004
Abstract

Cited by 31 (8 self)
Several researchers, including Leonid Levin, Gerard 't Hooft, and Stephen Wolfram, have argued that quantum mechanics will break down before the factoring of large numbers becomes possible. If this is true, then there should be a natural "Sure/Shor separator," that is, a set of quantum states that can account for all experiments performed to date, but not for Shor's factoring algorithm. We propose as a candidate the set of states expressible by a polynomial number of additions and tensor products. Using a recent lower bound on multilinear formula size due to Raz, we then show that states arising in quantum error-correction require n^Ω(log n) additions and tensor products even to approximate, which incidentally yields the first superpolynomial gap between general and multilinear formula size of functions. More broadly, we introduce a complexity classification of pure quantum states, and prove many basic facts about this classification. Our goal is to refine vague ideas about a breakdown of quantum mechanics into specific hypotheses that might be experimentally testable in the near future.
Simulating quantum computation by contracting tensor networks
SIAM Journal on Computing, 2005
Abstract

Cited by 30 (1 self)
The treewidth of a graph is a useful combinatorial measure of how close the graph is to a tree. We prove that a quantum circuit with T gates whose underlying graph has treewidth d can be simulated deterministically in T^O(1) exp[O(d)] time, which, in particular, is polynomial in T if d = O(log T). Among many implications, we show efficient simulations for quantum formulas, defined and studied by Yao (Proceedings of the 34th Annual Symposium on Foundations of Computer Science, 352–361, 1993), and for log-depth circuits whose gates apply to nearby qubits only, a natural constraint satisfied by most physical implementations. We also show that one-way quantum computation of Raussendorf and Briegel (Physical Review Letters, 86:5188–5191, 2001), a universal quantum computation scheme with promising physical implementations, can be efficiently simulated by a randomized algorithm if its quantum resource is derived from a small-treewidth graph.
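A minimal sketch of the gate-as-tensor viewpoint the abstract relies on (a toy fixed circuit with a hand-picked contraction order, not the paper's treewidth-driven ordering): each k-qubit gate becomes a tensor with k output and k input indices, and simulation is pure index contraction.

```python
import numpy as np

# 1-qubit gates are (2,2) tensors; 2-qubit gates are (2,2,2,2) tensors.
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]]).reshape(2, 2, 2, 2)

psi = np.zeros((2, 2))
psi[0, 0] = 1.0                            # |00> as a rank-2 tensor

psi = np.einsum('ai,ij->aj', H, psi)       # apply H to qubit 0
psi = np.einsum('abij,ij->ab', CNOT, psi)  # apply CNOT to qubits 0,1

# psi is now the Bell state: amplitude 1/sqrt(2) on |00> and on |11>
```

The contraction order is what the paper optimizes: a good order (small treewidth) keeps intermediate tensors small, which is where the exp[O(d)] bound comes from.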
A Quantum Logic Array Microarchitecture: Scalable Quantum Data Movement and Computation
Proceedings of the 38th International Symposium on Microarchitecture (MICRO-38), 2005
Abstract

Cited by 28 (3 self)
Recent experimental advances have demonstrated technologies capable of supporting scalable quantum computation. A critical next step is how to put those technologies together into a scalable, fault-tolerant system that is also feasible. We propose a Quantum Logic Array (QLA) microarchitecture that forms the foundation of such a system. The QLA focuses on the communication resources necessary to efficiently support fault-tolerant computations. We leverage the extensive groundwork in quantum error correction theory and provide analysis that shows that our system is both asymptotically and empirically fault tolerant. Specifically, we use the QLA to implement a hierarchical, array-based design and a logarithmic-expense quantum-teleportation communication protocol. Our goal is to overcome the primary scalability challenges of reliability, communication, and quantum resource distribution that plague current proposals for large-scale quantum computing. Our work complements recent work by Balenseifer et al. [1], which studies the software tool chain necessary to simplify development of quantum applications; here we focus on modeling a full-scale optimized microarchitecture for scalable computing.
The learnability of quantum states
quant-ph/0608142, 2006
Abstract

Cited by 21 (3 self)
Traditional quantum state tomography requires a number of measurements that grows exponentially with the number of qubits n. But using ideas from computational learning theory, we show that "for most practical purposes" one can learn a state using a number of measurements that grows only linearly with n. Besides possible implications for experimental physics, our learning theorem has two applications to quantum computing: first, a new simulation of quantum one-way communication protocols, and second, the use of trusted classical advice to verify untrusted quantum advice.
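The exponential cost of traditional tomography that the abstract contrasts against comes from parameter counting: an n-qubit state has on the order of 4^n real parameters, each of which must be estimated from repeated measurements. A hypothetical single-qubit sketch (variable names and the shot count are ours, chosen for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)

# A single-qubit state is fixed by its Bloch vector (<X>, <Y>, <Z>);
# an n-qubit state needs ~4**n such expectation values.
true_bloch = np.array([0.3, -0.5, 0.6])

def estimate(r, shots):
    """Estimate a +/-1-valued observable with expectation r from `shots` samples."""
    outcomes = rng.choice([1.0, -1.0], size=shots, p=[(1 + r) / 2, (1 - r) / 2])
    return outcomes.mean()

est = np.array([estimate(r, 10_000) for r in true_bloch])
# est is close to true_bloch, up to ~1/sqrt(shots) statistical error
```

The learning theorem sidesteps this blow-up by only demanding accurate predictions on most measurements drawn from a distribution, rather than all parameters at once.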
Toward a software architecture for quantum computing design tools
Proceedings of the 2nd International Workshop on Quantum Programming Languages (QPL), 2004
Abstract

Cited by 21 (3 self)
Compilers and computer-aided design tools are essential for fine-grained control of nanoscale quantum-mechanical systems. A proposed four-phase design flow assists with computations by transforming a quantum algorithm from a high-level language program into precisely scheduled physical actions. Quantum computers have the potential to solve certain computational problems—for example, factoring composite numbers or comparing an unknown image against a large database—more efficiently than modern computers. They are also indispensable in controlling quantum-mechanical systems in emergent nanotechnology applications, such as secure optical communication, in which modern computers cannot natively operate on quantum data. Despite convincing laboratory demonstrations of ...
Quantum Copy-Protection and Quantum Money
Abstract

Cited by 20 (5 self)
Forty years ago, Wiesner proposed using quantum states to create money that is physically impossible to counterfeit, something that cannot be done in the classical world. However, Wiesner's scheme required a central bank to verify the money, and the question of whether there can be unclonable quantum money that anyone can verify has remained open since. One can also ask a related question, which seems to be new: can quantum states be used as copy-protected programs, which let the user evaluate some function f, but not create more programs for f? This paper tackles both questions using the arsenal of modern computational complexity. Our main result is that there exist quantum oracles relative to which publicly-verifiable quantum money is possible, and any family of functions that cannot be efficiently learned from its input-output behavior can be quantumly copy-protected. This provides the first formal evidence that these tasks are achievable. The technical core of our result is a "Complexity-Theoretic No-Cloning Theorem," which generalizes both the standard No-Cloning Theorem and the optimality of Grover search, and might be of independent interest. Our security argument also requires explicit constructions of quantum t-designs. Moving beyond the oracle world, we also present an explicit candidate scheme for publicly-verifiable quantum money, based on random stabilizer states; as well as two explicit schemes for copy-protecting the family of point functions. We do not know how to base the security of these schemes on any existing cryptographic assumption. (Note that without an oracle, we can only hope for security under some computational assumption.)
Probabilistic Model-Checking of Quantum Protocols
DCM 2006: Proceedings of the 2nd International Workshop on Developments in Computational Models, 2005
Abstract

Cited by 17 (6 self)
We establish fundamental and general techniques for formal verification of quantum protocols. Quantum protocols are novel communication schemes involving the use of quantum-mechanical phenomena for representation, storage and transmission of data. As opposed to quantum computers, quantum communication systems can and have been implemented using present-day technology; therefore, the ability to model and analyse such systems rigorously is of primary importance. While current analyses of quantum protocols use a traditional mathematical approach and require considerable understanding of the underlying physics, we argue that automated verification techniques provide an elegant alternative. We demonstrate these techniques through the use of PRISM, a probabilistic model-checking tool. Our approach is conceptually simpler than existing proofs, and allows us to disambiguate protocol definitions and assess their properties. It also facilitates detailed analyses of actual implemented systems. We illustrate our techniques by modelling a selection of quantum protocols (namely superdense coding, quantum teleportation, and quantum error correction) and verifying their basic correctness properties. Our results provide a foundation for further work on modelling and analysing larger systems such as those used for quantum cryptography, in which basic protocols are used as components.
Checking Equivalence of Quantum Circuits and States
, 2007
Abstract

Cited by 17 (0 self)
Quantum computing promises exponential speedups for important simulation and optimization problems. It also poses new CAD problems that are similar to, but more challenging than, the related problems in classical (non-quantum) CAD, such as determining whether two states or circuits are functionally equivalent. While differences in classical states are easy to detect, quantum states, which are represented by complex-valued vectors, exhibit subtle differences leading to several notions of equivalence. This provides flexibility in optimizing quantum circuits, but leads to difficult new equivalence-checking issues for simulation and synthesis. We identify several different equivalence-checking problems and present algorithms for practical benchmarks, including quantum communication and search circuits, which are shown to be very fast and robust for hundreds of qubits.
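One of the equivalence notions involved, equality up to a global phase, can be checked directly on small dense matrices. A hedged sketch (the function name is ours; practical checkers use decision-diagram representations to reach the hundreds of qubits the abstract mentions):

```python
import numpy as np

def equivalent_up_to_phase(U, V, tol=1e-9):
    """True iff U = e^{i*theta} * V for some real theta.

    Assumes V is a unitary (so its largest-magnitude entry is nonzero)."""
    # Use V's largest-magnitude entry to extract the candidate phase.
    idx = np.unravel_index(np.argmax(np.abs(V)), V.shape)
    phase = U[idx] / V[idx]
    # The phase must have unit modulus, and must reconcile every entry.
    return np.isclose(abs(phase), 1.0, atol=tol) and np.allclose(U, phase * V, atol=tol)

H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
X = np.array([[0, 1], [1, 0]])

print(equivalent_up_to_phase(H, np.exp(0.7j) * H))  # True: same gate, shifted phase
print(equivalent_up_to_phase(H, X))                 # False: different gates
```

Phase equivalence is the coarsest of the notions hinted at above: states differing only by a global phase are physically indistinguishable, so optimizers are free to exploit it.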