Results 1–10 of 24 for "Oracle quantum computing"
Brassard & U. Vazirani, Strengths and weaknesses of quantum computing
, 1994
Cited by 115 (8 self)
"Because nature isn't classical, dammit..."
Complexity-Theoretic Aspects of Interactive Proof Systems
, 1989
"... In 1985, Goldwasser, Micali and Rackoff formulated interactive proof systems as a tool for developing cryptographic protocols. Indeed, many exciting cryptographic results followed from studying interactive proof systems and the related concept of zeroknowledge. Interactive proof systems also have a ..."
Abstract

Cited by 24 (3 self)
 Add to MetaCart
In 1985, Goldwasser, Micali and Rackoff formulated interactive proof systems as a tool for developing cryptographic protocols. Indeed, many exciting cryptographic results followed from studying interactive proof systems and the related concept of zero-knowledge. Interactive proof systems also play an important part in complexity theory, merging the well-established concepts of probabilistic and nondeterministic computation. This thesis will study the complexity of various models of interactive proof systems. A perfect zero-knowledge interactive protocol convinces a verifier that a string is in a language without revealing any additional knowledge in an information-theoretic sense. This thesis will show that for any language that has a perfect zero-knowledge proof system, its complement has a short interactive protocol. This result implies that there are no perfect zero-knowledge protocols for NP-complete languages unless the polynomial-time hierarchy collapses. Thus knowledge comp...
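Stated in now-standard complexity-class notation (which the abstract itself does not use), the collapse result sketched above reads roughly as follows:

```latex
% Perfect zero-knowledge gives short interactive proofs for complements:
\mathrm{PZK} \subseteq \mathrm{co\text{-}AM}
% Hence a perfect zero-knowledge protocol for an NP-complete language
% would yield \mathrm{NP} \subseteq \mathrm{co\text{-}AM}, equivalently
% \mathrm{coNP} \subseteq \mathrm{AM}, which is known to collapse the
% polynomial-time hierarchy to its second level.
```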
A super-polynomial lower bound for regular arithmetic formulas.
 In Proc. 46th Annual ACM Symposium on the Theory of Computing,
, 2014
"... Abstract We consider arithmetic formulas consisting of alternating layers of addition (+) and multiplication (×) gates such that the fanin of all the gates in any fixed layer is the same. Such a formula Φ which additionally has the property that its formal/syntactic degree is at most twice the (tot ..."
Abstract

Cited by 15 (6 self)
 Add to MetaCart
(Show Context)
We consider arithmetic formulas consisting of alternating layers of addition (+) and multiplication (×) gates such that the fan-in of all the gates in any fixed layer is the same. Such a formula Φ which additionally has the property that its formal/syntactic degree is at most twice the (total) degree of its output polynomial, we refer to as a regular formula. As usual, we allow arbitrary constants from the underlying field F on the incoming edges to a + gate, so that a + gate can in fact compute an arbitrary F-linear combination of its inputs. We show that there is an (n² + 1)-variate polynomial of degree 2n in VNP such that any regular formula computing it must be of size at least n^{Ω(log n)}. Along the way, we examine depth-four (ΣΠΣΠ) regular formulas wherein all multiplication gates in the layer adjacent to the inputs have fan-in a and all multiplication gates in the layer adjacent to the output node have fan-in b. We refer to such formulas as ΣΠ^[b]ΣΠ^[a] formulas. We show that there exists an n²-variate polynomial of degree n in VNP such that any ΣΠ...
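The layered-formula model in this abstract can be made concrete with a small sketch. The function below is hypothetical (not from the paper): it evaluates a formula given as alternating +/× layers where every gate in a layer has the same fan-in, which is exactly the uniformity condition the abstract describes.

```python
# Minimal sketch (illustrative, not the paper's construction): evaluate a
# formula of alternating +/x layers with uniform fan-in per layer.
def eval_layered(values, layer_ops, fanins):
    """values: leaf inputs; layer_ops: ops from leaves up, e.g. ['*', '+'];
    fanins: the uniform fan-in of each layer's gates."""
    level = list(values)
    for op, k in zip(layer_ops, fanins):
        assert len(level) % k == 0, "uniform fan-in must divide the level size"
        groups = [level[i:i + k] for i in range(0, len(level), k)]
        if op == '+':
            level = [sum(g) for g in groups]
        else:  # '*'
            level = [eval_prod(g) for g in groups]
    return level

def eval_prod(group):
    result = 1
    for v in group:
        result *= v
    return result

# A tiny SigmaPi example, (1*2) + (3*4): a product layer of fan-in 2
# followed by a sum layer of fan-in 2.
print(eval_layered([1, 2, 3, 4], ['*', '+'], [2, 2]))  # -> [14]
```

Note that each × layer of fan-in a multiplies the formal degree by a, which is why the paper's regularity condition (syntactic degree at most twice the output degree) constrains the admissible fan-in sequences.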
Complexity-Restricted Advice Functions
"... . We consider uniform subclasses of the nonuniform complexity classes defined by Karp and Lipton [23] via the notion of advice functions. These subclasses are obtained by restricting the complexity of computing correct advice. We also investigate the effect of allowing advice functions of limited co ..."
Abstract

Cited by 13 (4 self)
 Add to MetaCart
We consider uniform subclasses of the nonuniform complexity classes defined by Karp and Lipton [23] via the notion of advice functions. These subclasses are obtained by restricting the complexity of computing correct advice. We also investigate the effect of allowing advice functions of limited complexity to depend on the input rather than on the input's length. Among other results, using the notions described above, we give new characterizations of (a) NP^{NP ∩ SPARSE}, (b) NP with restricted access to an NP oracle, and (c) the odd levels of the Boolean hierarchy. As a consequence, we show that every set that is nondeterministically truth-table reducible to SAT in the sense of Rich [35] is already deterministically truth-table reducible to SAT. Furthermore, it turns out that the NP reduction classes of bounded versions of this reducibility coincide with the odd levels of the Boolean hierarchy. Key words: nonuniform complexity classes, advice classes, optimization functions, restric...
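The Karp–Lipton advice-class notion underlying this abstract is standardly defined as follows (a sketch using conventional notation, not taken from the entry itself):

```latex
% L is in C/F iff some advice function f in F and some A in C satisfy:
L \in \mathcal{C}/\mathcal{F}
  \iff
  \exists f \in \mathcal{F},\ \exists A \in \mathcal{C}:\quad
  \forall x \ \bigl( x \in L \iff \langle x, f(|x|) \rangle \in A \bigr)
% Taking C = P and F = the polynomially length-bounded functions gives
% the familiar nonuniform class P/poly; the paper's subclasses arise by
% additionally bounding the complexity of computing a correct f.
```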
Nonmonotonic reasoning with quantified Boolean constraints
 In Proceedings of the 4th International Conference on Logic Programming and Nonmonotonic Reasoning (LPNMR97), number 1265 in LNCS
, 1997
"... Abstract. In this paper, we define and investigate the complexity of several nonmonotonic logics with quantified Boolean formulas as constraints. We give quantified constraint versions of the constraint programming formalism of Marek, Nerode, and Remmel [15] and of the natural extension of their the ..."
Abstract

Cited by 5 (1 self)
 Add to MetaCart
(Show Context)
In this paper, we define and investigate the complexity of several nonmonotonic logics with quantified Boolean formulas as constraints. We give quantified constraint versions of the constraint programming formalism of Marek, Nerode, and Remmel [15] and of the natural extension of their theory to default logic. We also introduce a new formalism which adds constraints to circumscription. We show that standard complexity results for each of these formalisms generalize in the quantified constraint case. Gogic, Kautz, Papadimitriou, and Selman [8] have introduced a new method for measuring the strengths of reasoning formalisms based on succinctness of model representation. We show that a natural hierarchy based on this measure exists between our versions of logic programming, circumscription, and default logic. Finally, we discuss some results about the relative succinctness of our reasoning formalisms versus any formalism for which model checking can be done somewhere in the polynomial-time hierarchy.
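The quantified Boolean constraints the abstract refers to can be illustrated with a minimal evaluator (a hypothetical sketch, unrelated to the paper's formalisms): closed QBF evaluation recurses on the quantifier prefix, branching on both truth values of each variable.

```python
# Illustrative sketch: evaluate a closed quantified Boolean formula by
# recursion on the quantifier prefix. The matrix is given as a Python
# predicate over an assignment dict; all names here are hypothetical.
def eval_qbf(prefix, matrix, assignment=None):
    """prefix: list of (quantifier, var) pairs, outermost first,
    e.g. [('forall', 'x'), ('exists', 'y')];
    matrix: function mapping an assignment dict to a bool."""
    assignment = dict(assignment or {})
    if not prefix:
        return matrix(assignment)
    quantifier, var = prefix[0]
    # Try both truth values for the outermost variable.
    branches = (eval_qbf(prefix[1:], matrix, {**assignment, var: b})
                for b in (False, True))
    return all(branches) if quantifier == 'forall' else any(branches)

# forall x exists y: x XOR y  -- true, since y can always be set to not x.
print(eval_qbf([('forall', 'x'), ('exists', 'y')],
               lambda a: a['x'] != a['y']))  # -> True
```

This brute-force recursion takes exponential time, which is consistent with QBF evaluation being PSPACE-complete; it is the source of the complexity jumps the paper establishes for its quantified constraint versions.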
ON COMPUTATIONAL INTRACTABILITY ASSUMPTIONS IN CRYPTOGRAPHY
, 2011
"... In cryptographic protocols, honest parties would prefer that their security is assured even in presence of adversarial parties who have unbounded computational power. Information theoretic secure realization of cryptographic primitives provides such guarantees; but for most tasks such strong securit ..."
Abstract

Cited by 2 (1 self)
 Add to MetaCart
In cryptographic protocols, honest parties would prefer that their security be assured even in the presence of adversarial parties who have unbounded computational power. Information-theoretically secure realization of cryptographic primitives provides such guarantees; but for most tasks, such strong security guarantees cannot be provided for any reasonable notion of security. The standard technique used in cryptography is to assume the existence of some puzzle whose hard instances are easy to generate but which no efficient algorithm can solve. Such assumptions, which define intractable problems for efficient algorithms, are called computational intractability assumptions. In this work, we motivate a study of computational intractability assumptions which is goal-driven and lends support to the fundamental nature of some of the traditional assumptions beyond their being historical accidents. Secure multiparty computation deals with the study of constructing secure protocols for general cryptographic tasks which conform to various notions of security. Inspired by complexity theory, we use the notion of reduction to further our understanding of computational intractability assumptions. Our framework explores the hardness of natural assumptions of the form: "Task F can be securely computed given an ideally secure facility for computing G".
Average-Case Complexity Theory and Polynomial-Time Reductions
, 2001
"... This thesis studies averagecase complexity theory and polynomialtime reducibilities. The issues in averagecase complexity arise primarily from Cai and Selman's extension of Levin's denition of average polynomial time. We study polynomialtime reductions between distributional problems. ..."
Abstract

Cited by 2 (0 self)
 Add to MetaCart
This thesis studies average-case complexity theory and polynomial-time reducibilities. The issues in average-case complexity arise primarily from Cai and Selman's extension of Levin's definition of average polynomial time. We study polynomial-time reductions between distributional problems. Under strong but reasonable hypotheses we separate ordinary NP-completeness notions.
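Levin's definition, which the abstract builds on, is usually stated as follows (the standard formulation, not quoted from the entry): a time bound t is polynomial on average with respect to an input distribution μ if

```latex
\exists\, \varepsilon > 0:\quad
\sum_{x \in \{0,1\}^{*}} \mu(x)\,\frac{t(x)^{\varepsilon}}{|x|} \;<\; \infty
% Taking the epsilon-th root inside the sum, rather than requiring the
% plain expectation of t(x) to be polynomial in |x|, makes the notion
% robust under polynomial composition of algorithms -- the property
% that motivated Levin's choice.
```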
Limits of Constructive Security Proofs
, 2008
"... Abstract. The collisionresistance of hash functions is an important foundation of many cryptographic protocols. Formally, collisionresistance can only be expected if the hash function in fact constitutes a parametrized family of functions, since for a single function, the adversary could simply kn ..."
Abstract

Cited by 2 (0 self)
 Add to MetaCart
(Show Context)
The collision-resistance of hash functions is an important foundation of many cryptographic protocols. Formally, collision-resistance can only be expected if the hash function in fact constitutes a parametrized family of functions, since for a single function, the adversary could simply know a single hard-coded collision. In practical applications, however, unkeyed hash functions are a common choice, creating a gap between the practical application and the formal proof and, even more importantly, the concise mathematical definitions. A pragmatic way out of this dilemma was recently formalized by Rogaway: instead of requiring that no adversary exists that breaks the protocol (existential security), one requires that, given an adversary that breaks the protocol, we can efficiently construct a collision of the hash function using an explicitly given reduction (constructive security). In this paper, we show the limits of this approach: we give a protocol that is existentially secure, but that provably cannot be proven secure using a constructive security proof. Consequently, constructive security, albeit constituting a useful improvement over the state of the art, is not comprehensive enough to encompass all protocols that can be dealt with using existential security proofs.
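The "hard-coded collision" point in the abstract can be demonstrated concretely. The sketch below (illustrative names, not from the paper) uses a deliberately truncated hash so a collision is cheap to find once, offline; a real adversary attacking a single fixed function would simply ship such a pair, whereas a keyed family forces it to attack a freshly sampled key.

```python
# Illustrative sketch: why collision-resistance is defined for a *family*
# of hash functions. For any single fixed function h, some colliding pair
# exists and an adversary can have it hard-coded.
import hashlib

def h(data: bytes) -> bytes:
    # A single, unkeyed function. Truncated to 1 byte purely so that
    # collisions are easy to find for this demonstration.
    return hashlib.sha256(data).digest()[:1]

def find_hardcoded_collision():
    # Brute-force search done once, offline; the adversary then just
    # ships the resulting pair with its code.
    seen = {}
    i = 0
    while True:
        m = i.to_bytes(4, 'big')
        d = h(m)
        if d in seen and seen[d] != m:
            return seen[d], m
        seen[d] = m
        i += 1

def keyed_h(key: bytes, data: bytes) -> bytes:
    # A keyed family {H_k}: a collision hard-coded for one key need not
    # collide under a freshly sampled key, so the formal definition can
    # meaningfully quantify over keys.
    return hashlib.sha256(key + data).digest()[:1]

m1, m2 = find_hardcoded_collision()
assert m1 != m2 and h(m1) == h(m2)  # the shipped, hard-coded collision
```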