Results 1–10 of 22
From extractable collision resistance to succinct non-interactive arguments of knowledge, and back again
 In Proceedings of the 3rd Innovations in Theoretical Computer Science Conference (ITCS '12), 2012
Abstract

Cited by 63 (18 self)
The existence of non-interactive succinct arguments (namely, non-interactive computationally sound proof systems where the verifier’s time complexity is only polylogarithmically related to the complexity of deciding the language) has been an intriguing question for the past two decades. The question has gained renewed importance in light of the recent interest in delegating computation to untrusted workers. Still, other than Micali’s CS proofs in the Random Oracle Model, the only existing candidate construction is based on an elaborate assumption that is tailored to the specific proposal [Di Crescenzo and Lipmaa, CiE ’08]. We modify and reanalyze that construction:
• We formulate a general and relatively mild notion of extractable collision-resistant hash functions (ECRHs), and show that if ECRHs exist then the modified construction is a non-interactive succinct argument (SNARG) for NP. Furthermore, we show that (a) this construction is a proof of knowledge, and (b) it remains secure against adaptively chosen instances. These two properties are arguably essential for using the construction as a delegation-of-computation scheme.
• We show that existence of SNARGs of knowledge (SNARKs) for NP implies existence of ECRHs, as well as extractable variants of some other cryptographic primitives. This provides further evidence …
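The polylogarithmic verification cost discussed in this abstract is classically achieved by committing to a long proof string with a Merkle tree, as in Kilian's interactive argument and Micali's CS proofs. The sketch below is illustrative only (it is not the paper's ECRH-based construction, and all function names are ours): the verifier checks one committed value using a number of hashes logarithmic in the committed data size.

```python
import hashlib

def h(data: bytes) -> bytes:
    return hashlib.sha256(data).digest()

def merkle_root(leaves):
    # Build the tree bottom-up; assumes len(leaves) is a power of two.
    level = [h(x) for x in leaves]
    while len(level) > 1:
        level = [h(level[i] + level[i + 1]) for i in range(0, len(level), 2)]
    return level[0]

def auth_path(leaves, index):
    # Sibling hashes from leaf to root: log2(n) hashes in total.
    level = [h(x) for x in leaves]
    path = []
    while len(level) > 1:
        path.append(level[index ^ 1])  # sibling at this level
        level = [h(level[i] + level[i + 1]) for i in range(0, len(level), 2)]
        index //= 2
    return path

def verify(root, leaf, index, path):
    # Verifier does O(log n) hashing, independent of total data size.
    node = h(leaf)
    for sibling in path:
        node = h(node + sibling) if index % 2 == 0 else h(sibling + node)
        index //= 2
    return node == root

leaves = [bytes([i]) * 8 for i in range(8)]
root = merkle_root(leaves)
assert verify(root, leaves[3], 3, auth_path(leaves, 3))
```

The succinct-argument constructions surveyed here add PCP machinery and extractability assumptions on top of this commitment idea; the sketch shows only why the verifier's work can stay small.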
Making argument systems for outsourced computation practical (sometimes)
 In NDSS, 2012
Abstract

Cited by 38 (7 self)
This paper describes the design, implementation, and evaluation of a system for performing verifiable outsourced computation. It has long been known that (1) this problem can be solved in theory using probabilistically checkable proofs (PCPs) coupled with modern cryptographic tools, and (2) these solutions have wholly impractical performance, according to the conventional (and well-founded) wisdom. Our goal is to challenge (2), with a built system that implements an argument system based on PCPs. We describe a general-purpose system that builds on work of Ishai et al. (CCC ’07) and incorporates new theoretical work to improve performance by 20 orders of magnitude. The system is (arguably) practical in some cases, suggesting that, as a tool for building secure systems, PCPs are not a lost cause.
Toward practical and unconditional verification of remote computations
Abstract

Cited by 23 (8 self)
This paper revisits a classic question: how can a machine specify a computation to another one and then, without executing the computation, check that the other machine carried it out correctly? The applications of such a primitive …
Targeted malleability: Homomorphic encryption for restricted computations, 2011
Abstract

Cited by 17 (1 self)
We put forward the notion of targeted malleability: given a homomorphic encryption scheme, in various scenarios we would like to restrict the homomorphic computations one can perform on encrypted data. We introduce a precise framework, generalizing the foundational notion of non-malleability introduced by Dolev, Dwork, and Naor (SICOMP ’00), ensuring that the malleability of a scheme is targeted only at a specific set of “allowable” functions. In this setting we are mainly interested in the efficiency of such schemes as a function of the number of repeated homomorphic operations. Whereas constructing a scheme whose ciphertext grows linearly with the number of such operations is straightforward, obtaining more realistic (or merely nontrivial) length guarantees is significantly more challenging. We present two constructions that transform any homomorphic encryption scheme into one that offers targeted malleability. Our constructions rely on standard cryptographic tools and on succinct non-interactive arguments, which are currently known to exist in the standard model based on variants of the knowledge-of-exponent assumption. The two constructions offer somewhat different efficiency guarantees, each of which may be preferable depending on the underlying building blocks.
Keywords: Homomorphic encryption, Non-malleable encryption.
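The "straightforward" linear-growth construction mentioned in the abstract can be caricatured without real cryptography (a toy sketch, not the paper's scheme: the XOR "encryption", the ALLOWED table, and all names are invented for illustration): the ciphertext carries a log of the allowable functions applied to it, so its length grows by one entry per homomorphic operation, and the decryptor replays only whitelisted functions.

```python
# Toy illustration of targeted malleability via an explicit operation log.
# Real schemes replace the log with succinct non-interactive arguments so
# the ciphertext does NOT grow linearly in the number of operations.

ALLOWED = {"inc": lambda x: x + 1, "double": lambda x: 2 * x}

def encrypt(key, m):
    return (m ^ key, [])              # toy XOR "encryption", empty op log

def apply_homomorphic(ct, op_name):
    body, ops = ct
    assert op_name in ALLOWED, "operation outside the allowable set"
    return (body, ops + [op_name])    # ciphertext grows by one entry

def decrypt(key, ct):
    body, ops = ct
    m = body ^ key
    for name in ops:                  # replay only allowable functions
        m = ALLOWED[name](m)
    return m

key = 0b101010
ct = encrypt(key, 5)
ct = apply_homomorphic(ct, "inc")
ct = apply_homomorphic(ct, "double")
assert decrypt(key, ct) == 12        # (5 + 1) * 2
```

The paper's contribution is precisely to avoid this linear blow-up while keeping the evaluator restricted to the allowable set.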
Compiling computations to constraints for verified computation. UT Austin Honors thesis HR1210, 2012
Abstract

Cited by 5 (3 self)
We present a compiler that automates the task of converting high-level code to constraint sets of the form accepted by the Ginger and Zaatar protocols for verified computation. Performing the conversion from high-level code to constraints by hand is prone to human error and therefore not practical for large computations. This paper evaluates the performance of the compiler and the effectiveness of its optimizations in reducing the size of the constraint set. We show that the compiler can produce constraint sets for a number of interesting computations, including DNA sequence alignment of 200-nucleotide sequences and partition-about-medoids clustering of 100-dimensional data into two clusters.
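The exact Ginger/Zaatar constraint formats are not reproduced here. As a generic illustration, verified-computation compilers in this line of work target quadratic (rank-1-style) constraints over a finite field, each asserting ⟨A,w⟩·⟨B,w⟩ = ⟨C,w⟩ for an assignment vector w of inputs, intermediates, and outputs. A minimal sketch, with the field size and witness layout chosen for illustration:

```python
# Each constraint is (A, B, C), sparse coefficient dicts over indices of w,
# and asserts  <A,w> * <B,w> == <C,w>  (mod P).

P = 97  # small prime field, for illustration only

def dot(coeffs, w):
    return sum(c * w[i] for i, c in coeffs.items()) % P

def satisfied(constraints, w):
    return all(dot(A, w) * dot(B, w) % P == dot(C, w)
               for A, B, C in constraints)

# "Compile"  out = x * y + 3  with witness layout w = (1, x, y, t, out):
#   constraint 1:  x * y       == t
#   constraint 2:  (t + 3) * 1 == out
constraints = [
    ({1: 1}, {2: 1}, {3: 1}),
    ({3: 1, 0: 3}, {0: 1}, {4: 1}),
]

x, y = 4, 5
w = (1, x, y, x * y, x * y + 3)
assert satisfied(constraints, w)
```

Hand-writing such tuples for a 200-nucleotide alignment would involve many thousands of constraints, which is the error-prone process the compiler automates.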
Enforcing Language Semantics Using Proof-Carrying Data (extended version), 2013
Abstract

Cited by 1 (1 self)
The soundness of language-level reasoning about programs relies on program execution adhering to the language semantics. However, in a distributed computation, when a value is sent from one party to another, the receiver faces the question of whether the value is well-traced, i.e., could it have been produced by a computation that respects the language semantics? Otherwise, accepting the value may lead to bugs or vulnerabilities. Proof-Carrying Data (PCD) is a recently introduced cryptographic mechanism that allows messages in a distributed computation to be accompanied by a proof that the message, and the history leading to it, complies with a specified predicate. Using PCD, a verifier can be convinced that the predicate held throughout the distributed computation, even in the presence of malicious parties, and at a verification cost that is independent of the size of the computation producing the value. With a suitable choice of predicate, a program may use PCD to check that values received from the network are well-traced. Unfortunately, previous approaches to using PCD required tailoring a specialized predicate for each application, using an inconvenient formalism and with little methodological support. This work introduces a novel, PCD-based approach to enforcing language semantics in a distributed …
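The compliance idea can be sketched without any cryptography (illustrative only: in real PCD each message carries a succinct proof and the verifier never replays the history; the predicate and transcript format below are invented for the example). A compliance predicate is checked locally at every hop, and PCD guarantees the receiver this held along the entire history:

```python
# transcript: list of (incoming_msg, local_input, outgoing_msg) hops.
# Real PCD verifies one succinct proof per message instead of replaying.

def compliant(predicate, transcript):
    return all(predicate(m_in, local, m_out)
               for m_in, local, m_out in transcript)

# Hypothetical compliance predicate: each hop may only add a non-negative
# amount to the running total it received.
def predicate(m_in, local, m_out):
    return local >= 0 and m_out == m_in + local

transcript = [(0, 3, 3), (3, 4, 7), (7, 0, 7)]
assert compliant(predicate, transcript)
assert not compliant(predicate, [(0, -1, -1)])
```

The paper's point is that writing such predicates by hand for each application is awkward, motivating enforcement of language semantics instead of ad hoc predicates.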
unknown title, 2015
Abstract
My research makes it easier to build computer systems that handle sensitive information correctly. Society increasingly relies on computer systems that handle sensitive information: government, businesses, the military, and private individuals all use computer systems that manipulate confidential and/or untrusted data. For example: health management systems with confidential patient records, web applications with data supplied by potentially malicious users, and mobile devices with private user data. Sensitive information, both confidential and untrusted, must be treated carefully: confidential information must not be inappropriately released, and the use of untrusted information must not corrupt trusted computation. However, “handling sensitive information correctly” depends in large part on the specifics of the application. For example, the restrictions on who may learn about a patient’s medical records are vastly different from those on who may learn about the financial records of a privately held company. Thus, key challenges in this area are to develop general techniques and tools to: (1) define application-specific security guarantees; (2) express and understand the security requirements of applications; and (3) enforce application-specific security requirements. My research goal is to develop tools and techniques that allow developers of computer systems …
Certified by …, 2013
Abstract
We present a proof system that allows efficient verification of NP statements, given proofs produced by an untrusted yet computationally bounded prover. Our system is publicly verifiable: after a trusted third party has generated a proving key and a verification key, anyone can use the proving key to generate non-interactive proofs for adaptively chosen NP statements, and the proofs can be verified by anyone using the verification key. Moreover, our system is statistically zero-knowledge and the generated public parameters are reusable. The NP-complete language we choose is the correct execution of programs on TinyRAM, a minimalistic (nondeterministic) random-access machine that we design. Together with …