Results 1–10 of 24
Lower bounds on the Efficiency of Generic Cryptographic Constructions
41st IEEE Symposium on Foundations of Computer Science (FOCS), IEEE, 2000
Abstract

Cited by 82 (6 self)
A central focus of modern cryptography is the construction of efficient, “high-level” cryptographic tools (e.g., encryption schemes) from weaker, “low-level” cryptographic primitives (e.g., one-way functions). Of interest are both the existence of such constructions and their efficiency. Here, we show essentially-tight lower bounds on the best possible efficiency of any black-box construction of some fundamental cryptographic tools from the most basic and widely-used cryptographic primitives. Our results hold in an extension of the model introduced by Impagliazzo and Rudich, and improve and extend earlier results of Kim, Simon, and Tetali. We focus on constructions of pseudorandom generators, universal one-way hash functions, and digital signatures based on one-way permutations, as well as constructions of public- and private-key encryption schemes based on trapdoor permutations. In each case, we show that any black-box construction beating our efficiency bound would yield the unconditional existence of a one-way function and thus, in particular, prove P != NP.
Finding collisions in interactive protocols – A tight lower bound on the round complexity of statistically-hiding commitments
In Proceedings of the 48th Annual IEEE Symposium on Foundations of Computer Science, 2007
Abstract

Cited by 42 (13 self)
We study the round complexity of various cryptographic protocols. Our main result is a tight lower bound on the round complexity of any fully-black-box construction of a statistically-hiding commitment scheme from one-way permutations, and even from trapdoor permutations. This lower bound matches the round complexity of the statistically-hiding commitment scheme due to Naor, Ostrovsky, Venkatesan and Yung (CRYPTO ’92). As a corollary, we derive similar tight lower bounds for several other cryptographic protocols, such as single-server private information retrieval, interactive hashing, and oblivious transfer that guarantees statistical security for one of the parties. Our techniques extend the collision-finding oracle due to Simon (EUROCRYPT ’98) to the setting of interactive protocols (our extension also implies an alternative proof for the main property of the original oracle). In addition, we substantially extend the reconstruction paradigm of Gennaro and Trevisan (FOCS ’00). In both cases, our extensions are quite delicate and may be found useful in proving additional black-box separation results.
Reducing complexity assumptions for statistically-hiding commitment
In EUROCRYPT, 2005
Abstract

Cited by 36 (8 self)
We revisit the following question: what are the minimal assumptions needed to construct statistically-hiding commitment schemes? Naor et al. show how to construct such schemes based on any one-way permutation. We improve upon this by showing a construction based on any approximable preimage-size one-way function. These are one-way functions for which it is possible to efficiently approximate the number of preimages of a given output. A special case is the class of regular one-way functions, where all points in the image of the function have the same number of preimages. We also prove two additional results related to statistically-hiding commitment. First, we prove a (folklore) parallel composition theorem showing, roughly speaking, that the statistical hiding property of any such commitment scheme is amplified exponentially when multiple independent parallel executions of the scheme are carried out. Second, we show a compiler which transforms any commitment scheme that is statistically hiding against an honest-but-curious receiver into one that is statistically hiding even against a malicious receiver.
Sufficient Conditions for Collision-Resistant Hashing
In Proceedings of the 2nd Theory of Cryptography Conference, 2005
Abstract

Cited by 19 (2 self)
Abstract. We present several new constructions of collision-resistant hash functions (CRHFs) from general assumptions. We start with a simple construction of CRHFs from any homomorphic encryption. Then, we strengthen this result by presenting constructions of CRHFs from two other primitives that are implied by homomorphic encryption: one-round private information retrieval (PIR) protocols and homomorphic one-way commitments.
Keywords. Collision-resistant hash functions, homomorphic encryption, private information retrieval.
1 Introduction. Collision-resistant hash functions (CRHFs) are an important cryptographic primitive. Their applications range from classic ones, such as the “hash-and-sign” paradigm for signatures, via efficient (zero-knowledge) arguments [14, 17, 2], to more recent applications such as ones relying on the non-black-box techniques of [1]. In light of the importance of the CRHF primitive, it is natural to study its relations with other primitives and try to construct it from the most general
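The “hash-and-sign” paradigm mentioned in this abstract can be sketched as follows. `toy_sign` is a hypothetical stand-in for a real public-key signature algorithm (HMAC is a MAC, not a signature), used only so the sketch runs; the point is that the signer processes the fixed-length hash rather than the arbitrary-length message, which is sound only if the hash is collision-resistant:

```python
import hashlib
import hmac
import secrets

def toy_sign(key: bytes, digest: bytes) -> bytes:
    # Stand-in for a real signature scheme (e.g., RSA or ECDSA).
    # HMAC is a symmetric MAC, not a public-key signature -- illustration only.
    return hmac.new(key, digest, hashlib.sha256).digest()

def hash_and_sign(key: bytes, message: bytes) -> bytes:
    # Sign the fixed-length hash of the message, not the message itself.
    # A collision H(m) == H(m') would let a forger reuse a signature
    # on m as a signature on m', which is why H must be collision-resistant.
    return toy_sign(key, hashlib.sha256(message).digest())
```

A collision in the hash is exactly what breaks the paradigm, which is why constructing CRHFs from general assumptions matters for signatures.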
On robust combiners for private information retrieval and other primitives
CRYPTO, 2006
Abstract

Cited by 17 (2 self)
Abstract. Let A and B denote cryptographic primitives. A (k, m)-robust A-to-B combiner is a construction which takes m implementations of primitive A as input and yields an implementation of primitive B that is guaranteed to be secure as long as at least k of the input implementations are secure. The main motivation for such constructions is tolerance against wrong assumptions on which the security of implementations is based. For example, a (1,2)-robust A-to-B combiner yields a secure implementation of B even if an assumption underlying one of the input implementations of A turns out to be wrong. In this work we study robust combiners for private information retrieval (PIR), oblivious transfer (OT), and bit commitment (BC). We propose a (1,2)-robust PIR-to-PIR combiner, and describe various optimizations based on properties of existing PIR protocols. The existence of simple PIR-to-PIR combiners is somewhat surprising, since OT, a very closely related primitive, seems difficult to combine (Harnik et al., EUROCRYPT ’05). Furthermore, we present (1,2)-robust PIR-to-OT and PIR-to-BC combiners. To the best of our knowledge these are the first constructions of A-to-B combiners with A ≠ B. Such combiners, in addition to being interesting in their own right, offer insights into relationships between cryptographic primitives. In particular, our PIR-to-OT combiner together with the impossibility result for OT combiners of Harnik et al. rules out certain types of reductions of PIR to OT. Finally, we suggest a more fine-grained approach to the construction of robust combiners, which may lead to more efficient and practical combiners in many scenarios.
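The combiner notion can be illustrated with the classic (1,2)-robust combiner for one-way functions (a simpler primitive than the PIR combiners studied in this paper): split the input and apply each candidate to its own half, so that inverting the combined output requires inverting both candidates. The two candidates below are illustrative stand-ins, one conjecturally secure and one deliberately broken:

```python
import hashlib

def candidate_A(x: bytes) -> bytes:
    # Conjecturally one-way candidate (SHA-256 as a stand-in).
    return hashlib.sha256(x).digest()

def candidate_B(x: bytes) -> bytes:
    # A deliberately broken candidate: the identity function is
    # trivially invertible.
    return x

def combiner(x: bytes) -> bytes:
    # (1,2)-robust OWF-to-OWF combiner: apply each candidate to its own
    # half of the input. A full preimage of the output yields preimages
    # under BOTH candidates, so one secure candidate suffices.
    half = len(x) // 2
    return candidate_A(x[:half]) + candidate_B(x[half:])
```

Even though `candidate_B` leaks its half of the input outright, the first half of the combined output remains as hard to invert as `candidate_A`, which is the guarantee a (1,2)-robust combiner provides.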
One-Way Permutations, Interactive Hashing and Statistically-Hiding Commitments
In S. Vadhan (Ed.): Theory of Cryptography (TCC) 2007, LNCS 4392, 2007
Abstract

Cited by 16 (1 self)
Abstract. We present a lower bound on the round complexity of a natural class of black-box constructions of statistically-hiding commitments from one-way permutations. This implies an Ω(n/log n) lower bound on the round complexity of a computational form of interactive hashing, which has been used to construct statistically-hiding commitments (and related primitives) from various classes of one-way functions, starting with the work of Naor, Ostrovsky, Venkatesan and Yung (J. Cryptology, 1998). Our lower bound matches the round complexity of the protocol studied by Naor et al.
Bounds on the Efficiency of Encryption and Digital Signatures
2002
Abstract

Cited by 12 (1 self)
A central focus of modern cryptography is to investigate the weakest possible assumptions under which various cryptographic algorithms exist. Typically, a proof that a "weak" primitive (e.g., a one-way function) implies the existence of some "strong" algorithm (e.g., a private-key encryption scheme) proceeds by giving an explicit construction of the latter from the former. Beyond merely showing such a construction, an equally important research direction is to explore the efficiency of the construction. One might argue that this line of research has become even more important now that minimal assumptions are known for many (but not all) algorithms of interest.
Bounds on the efficiency of “black-box” commitment schemes
32nd ICALP, 2005
Abstract

Cited by 9 (0 self)
Constructions of cryptographic primitives based on general assumptions (e.g., one-way functions) tend to be less efficient than constructions based on specific (e.g., number-theoretic) assumptions. This has prompted a recent line of research aimed at investigating the best possible efficiency of (black-box) cryptographic constructions based on general assumptions. Here, we present bounds on the efficiency of statistically-binding commitment schemes constructed using black-box access to one-way permutations; our bounds are tight for the case of perfectly-binding schemes. Our bounds hold in an extension of the Impagliazzo-Rudich model: we show that any construction beating our bounds would imply the unconditional existence of a one-way function (from which a statistically-binding commitment scheme could be constructed “from scratch”). Key words: Cryptography, commitment schemes
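The commitment schemes studied here can be illustrated by the classic perfectly-binding scheme from a one-way permutation f with the Goldreich-Levin hardcore predicate: commit to a bit b as (f(x), r, ⟨x, r⟩ ⊕ b). A minimal sketch, with a toy permutation standing in for f (it is a bijection but not one-way, so this code illustrates the structure only, not security):

```python
import secrets

N = 16  # toy bit-length; a real scheme would use a cryptographic f and n >= 128

def inner_product_bit(x: int, r: int) -> bool:
    # Goldreich-Levin hardcore predicate: <x, r> mod 2.
    return bin(x & r).count("1") % 2 == 1

def toy_permutation(x: int) -> int:
    # Placeholder for a one-way permutation on N-bit strings:
    # multiplication by an odd constant mod 2^N is a bijection,
    # but it is trivially invertible -- illustration only.
    return (x * 0x9E37) % (1 << N)

def commit(b: bool):
    x = secrets.randbits(N)
    r = secrets.randbits(N)
    # The commitment sent to the receiver; x is kept as the opening.
    c = (toy_permutation(x), r, inner_product_bit(x, r) ^ b)
    return c, x

def verify(c, x: int, b: bool) -> bool:
    y, r, masked = c
    return toy_permutation(x) == y and (inner_product_bit(x, r) ^ b) == masked
```

Because the permutation is injective, each commitment determines x, and hence b, uniquely: the scheme is perfectly binding. Hiding rests entirely on the one-wayness of f, which is exactly the resource whose black-box use the bounds above constrain.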
Honest verifier zero-knowledge arguments applied
Dissertation Series DS043, BRICS, 2004. PhD thesis. xii+119 pp.
Adaptive Zero-Knowledge Proofs and Adaptively Secure Oblivious Transfer
2009
Abstract

Cited by 7 (2 self)
In the setting of secure computation, a set of parties wish to securely compute some function of their inputs in the presence of an adversary. The adversary in question may be static (meaning that it controls a predetermined subset of the parties) or adaptive (meaning that it can choose to corrupt parties during the protocol execution, based on what it sees). In this paper, we study two fundamental questions relating to the basic zero-knowledge and oblivious transfer protocol problems:
• Adaptive zero-knowledge proofs: We ask whether it is possible to construct adaptive zero-knowledge proofs (with unconditional soundness). Beaver (STOC 1996) showed that known zero-knowledge proofs are not adaptively secure, and in addition showed how to construct adaptively secure zero-knowledge arguments (with computational soundness).
• Adaptively secure oblivious transfer: All known protocols for adaptively secure oblivious transfer rely on seemingly stronger hardness assumptions than for the case of static adversaries. We ask whether this is inherent and, in particular, whether it is possible to construct adaptively secure oblivious transfer from enhanced trapdoor permutations alone.