Results 1–10 of 31
Finding collisions in interactive protocols – A tight lower bound on the round complexity of statistically hiding commitments
 In Proceedings of the 48th Annual IEEE Symposium on Foundations of Computer Science
, 2007
Abstract

Cited by 42 (13 self)
We study the round complexity of various cryptographic protocols. Our main result is a tight lower bound on the round complexity of any fully black-box construction of a statistically hiding commitment scheme from one-way permutations, and even from trapdoor permutations. This lower bound matches the round complexity of the statistically hiding commitment scheme due to Naor, Ostrovsky, Venkatesan and Yung (CRYPTO ’92). As a corollary, we derive similar tight lower bounds for several other cryptographic protocols, such as single-server private information retrieval, interactive hashing, and oblivious transfer that guarantees statistical security for one of the parties. Our techniques extend the collision-finding oracle due to Simon (EUROCRYPT ’98) to the setting of interactive protocols (our extension also implies an alternative proof for the main property of the original oracle). In addition, we substantially extend the reconstruction paradigm of Gennaro and Trevisan (FOCS ’00). In both cases, our extensions are quite delicate and may be found useful in proving additional black-box separation results.
On the Compressibility of NP Instances and Cryptographic Applications
Abstract

Cited by 38 (0 self)
We study compression that preserves the solution to an instance of a problem rather than preserving the instance itself. Our focus is on the compressibility of NP decision problems. We consider NP problems that have long instances but relatively short witnesses. The question is: can one efficiently compress an instance and store a shorter representation that maintains the information of whether the original input is in the language or not? We want the length of the compressed instance to be polynomial in the length of the witness and polylogarithmic in the length of the original input. We discuss the differences between this notion and similar notions from parameterized complexity. Such compression enables one to succinctly store instances until a future setting will allow solving them, either via a technological or algorithmic breakthrough or simply until enough time has elapsed. We give a new classification of NP with respect to compression. This classification forms a stratification of NP that we call the VC hierarchy. The hierarchy is based on a new type of reduction called W-reduction, and there are compression-complete problems for each class. Our motivation for studying this issue stems from the vast cryptographic implications compressibility has. For example, we say that SAT is compressible if there exists a polynomial p(·, ·) so that given a
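As a toy illustration of the compression notion described above (not a construction from the paper; the language and all names below are invented for illustration), consider instances made of a short "core", on which membership depends, plus arbitrarily long padding:

```python
# Toy illustration of instance compression (hypothetical language, not from the paper).
# Language L: instances are (core, padding) where membership depends only on `core`,
# whose length is polynomial in the witness length; `padding` can be huge.
from itertools import combinations

def in_language(instance):
    core, _padding = instance
    # Membership: the core (a list of ints) has a nonempty subset summing to 0 --
    # an NP predicate whose witness (the subset) is short relative to the instance.
    return any(sum(c) == 0 for r in range(1, len(core) + 1)
               for c in combinations(core, r))

def compress(instance):
    # The compressed instance keeps only the core: its size depends on the
    # witness length, not on the padding length -- and membership is preserved.
    core, _padding = instance
    return (core, "")

inst = ([3, -1, -2, 7], "x" * 10**6)   # long instance, short relevant part
assert in_language(inst) == in_language(compress(inst)) == True
```

Real compressibility questions (e.g., for SAT) are exactly about instances where no such trivially discardable part exists.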
On basing one-way functions on NP-hardness
 In Proceedings of the Thirty-Eighth Annual ACM Symposium on Theory of Computing
, 2006
Abstract

Cited by 34 (1 self)
We consider the question of whether it is possible to base the existence of one-way functions on NP-hardness. That is, we study the possibility of reductions from a worst-case NP-hard decision problem to the task of inverting a polynomial-time computable function. We prove two negative results: 1. For any polynomial-time computable function f: the existence of a randomized non-adaptive reduction of worst-case NP problems to the task of average-case inverting f implies that coNP ⊆ AM. It is widely believed that coNP is not contained in AM. Thus, this result may be regarded as showing that such reductions cannot exist (unless coNP ⊆ AM). This result improves previous negative results that placed coNP in non-uniform AM. 2. For any polynomial-time computable function f for which it is possible to efficiently compute preimage sizes (i.e., |f^{-1}(y)| for a given y): the existence of a randomized reduction of worst-case NP problems to the task of inverting f implies that coNP ⊆ AM. Moreover, this is also true for functions for which it is possible to verify (via an AM protocol) the approximate size of preimages (i.e., |f^{-1}(y)| for a given y). These results hold for any reduction, including adaptive ones. The previously known negative results regarding worst-case to average-case reductions were confined to non-adaptive reductions. In the course of proving the above results, two new AM protocols emerge for proving upper bounds on the sizes of NP sets. Whereas the known lower bound protocol on set sizes by [Goldwasser-Sipser] works for any NP set, the known upper bound protocol on set sizes by [Aiello-Håstad] works in a setting where the verifier knows a random secret element (unknown to the prover) in the NP set. The new protocols we develop here each work under different requirements than those of [Aiello-Håstad], enlarging the settings in which it is possible to prove upper bounds on NP set sizes.
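The Goldwasser-Sipser lower bound protocol mentioned above can be simulated in a few lines: to support a claim |S| ≥ 2^k, the verifier picks a random hash into k bits and a random target z, and accepts if the prover exhibits an element of S hashing to z. A rough sketch with invented parameters follows; the actual protocol uses pairwise-independent hashing with repetition for amplification.

```python
# Simulation of one round of a Goldwasser-Sipser-style set lower bound:
# if |S| >> 2^k, a random target z almost always has a preimage in S that
# the prover can exhibit; if |S| << 2^k, it almost never does.
import random

def round_accepts(S, k, seed):
    rng = random.Random(seed)
    # Pairwise-independent-style hash h(x) = (a*x + b) mod p, reduced to k bits.
    p = 2**61 - 1
    a, b = rng.randrange(1, p), rng.randrange(p)
    h = lambda x: ((a * x + b) % p) % (2**k)
    z = rng.randrange(2**k)            # verifier's random target
    return any(h(s) == z for s in S)   # prover exhibits s in S with h(s) = z

big = set(range(5000))    # |S| well above 2^k
small = {1, 2, 3}         # |S| far below 2^k
k = 10                    # 2^k = 1024
acc_big = sum(round_accepts(big, k, i) for i in range(200))
acc_small = sum(round_accepts(small, k, i) for i in range(200))
# The large set is accepted in almost every round, the small one almost never.
```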
Statistical Zero-Knowledge Arguments for NP from Any One-Way Function
 ELECTRONIC COLLOQUIUM ON COMPUTATIONAL COMPLEXITY
, 2006
Abstract

Cited by 24 (3 self)
We show that every language in NP has a statistical zero-knowledge argument system under the (minimal) complexity assumption that one-way functions exist. In such protocols, even a computationally unbounded verifier cannot learn anything other than the fact that the assertion being proven is true, whereas a polynomial-time prover cannot convince the verifier to accept a false assertion except with negligible probability. This resolves an open question posed by Naor, Ostrovsky, Venkatesan, and Yung (CRYPTO ’92, J. Cryptology ’98). Departing from previous works on this problem, we do not construct standard statistically hiding commitments from any one-way function. Instead, we construct a relaxed variant of commitment schemes called “1-out-of-2-binding commitments,” recently introduced by Nguyen and Vadhan (STOC ’06).
One-Way Permutations, Interactive Hashing and Statistically Hiding Commitments
, 2007
Abstract

Cited by 17 (2 self)
We present a lower bound on the round complexity of a natural class of black-box constructions of statistically hiding commitments from one-way permutations. This implies an Ω(n/log n) lower bound on the round complexity of a computational form of interactive hashing, which has been used to construct statistically hiding commitments (and related primitives) from various classes of one-way functions, starting with the work of Naor, Ostrovsky, Venkatesan and Yung (J. Cryptology, 1998). Our lower bound matches the round complexity of the protocol studied by Naor et al.
Inaccessible Entropy
Abstract

Cited by 16 (8 self)
We put forth a new computational notion of entropy, which measures the (in)feasibility of sampling high-entropy strings that are consistent with a given protocol. Specifically, we say that the i’th round of a protocol (A, B) has accessible entropy at most k if no polynomial-time strategy A∗ can generate messages for A such that its message in the i’th round has entropy greater than k when conditioned both on prior messages of the protocol and on prior coin tosses of A∗. We say that the protocol has inaccessible entropy if the total accessible entropy (summed over the rounds) is noticeably smaller than the real entropy of A’s messages, conditioned only on prior messages (but not the coin tosses of A). As applications of this notion, we
• Give a much simpler and more efficient construction of statistically hiding commitment schemes from arbitrary one-way functions.
• Prove that constant-round statistically hiding commitments are necessary for constructing constant-round zero-knowledge proof systems for NP that remain secure under parallel composition (assuming the existence of one-way functions).
Categories and Subject Descriptors: F.0 [Theory of Computation]: General.
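In symbols, the round-i condition described above can be paraphrased roughly as follows (the notation M_i for A's round-i message and r for A∗'s coins is ours, not fixed by the abstract):

```latex
% Accessible entropy of round i: the most entropy any efficient A^*
% can place in A's i'th message, conditioning on the transcript so far
% and on A^*'s own prior coins r.
\mathrm{acc}H_i \;=\; \max_{\mathrm{PPT}\ A^*}\;
   H\bigl(M_i \,\big|\, M_1,\dots,M_{i-1},\, r\bigr)
% The protocol has inaccessible entropy when the accessible total falls
% noticeably below the real entropy of A's messages:
\sum_i \mathrm{acc}H_i \;\ll\; \sum_i H\bigl(M_i \,\big|\, M_1,\dots,M_{i-1}\bigr)
```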
Statistically Hiding Commitments and Statistical Zero-Knowledge Arguments from Any One-Way Function
, 2007
Abstract

Cited by 15 (7 self)
We give a construction of statistically hiding commitment schemes (ones where the hiding property holds against even computationally unbounded adversaries) under the minimal complexity assumption that one-way functions exist. Consequently, one-way functions suffice to give statistical zero-knowledge arguments for any NP statement (whereby even a computationally unbounded adversarial verifier learns nothing other than the fact that the assertion being proven is true, and a polynomial-time adversarial prover cannot convince the verifier of a false statement). These results resolve an open question posed by Naor, Ostrovsky, Venkatesan, and Yung (CRYPTO ’92, J. Cryptology ’98).
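For contrast with the one-way-function-based construction described above, a classic example of a statistically hiding, computationally binding commitment is Pedersen's scheme, which instead relies on the hardness of discrete logarithms (the toy parameters below are far too small to be secure and are chosen only for illustration):

```python
# Pedersen commitment: statistically (here perfectly) hiding,
# computationally binding under the discrete-log assumption.
# Not the paper's construction -- shown only to illustrate the notions.
import random

# Toy group: the order-q subgroup of Z_p^* with p = 2q + 1 a safe prime;
# g, h generate it, and log_g(h) must be unknown to the committer.
q = 1019
p = 2 * q + 1                 # 2039, prime
g = pow(2, 2, p)              # squaring lands in the order-q subgroup
h = pow(3, 2, p)

def commit(m, r):
    # c = g^m * h^r mod p; with r uniform in Z_q, c is distributed
    # independently of m, so the commitment hides m perfectly.
    return (pow(g, m, p) * pow(h, r, p)) % p

def verify(c, m, r):
    return commit(m, r) == c

r = random.randrange(q)
c = commit(42, r)
assert verify(c, 42, r)
# Binding is only computational: opening c to a different message
# requires knowing log_g(h), i.e., solving a discrete logarithm.
```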
A new interactive hashing theorem
 In Proceedings of the 22nd Annual IEEE Conference on Computational Complexity
, 2007
Abstract

Cited by 13 (5 self)
Interactive hashing, introduced by Naor, Ostrovsky, Venkatesan and Yung (CRYPTO ’92), plays an important role in many cryptographic protocols. In particular, it is a major component in all known constructions of statistically hiding and computationally binding commitment schemes, and of zero-knowledge arguments, based on general one-way permutations and on one-way functions. Interactive hashing with respect to a one-way permutation f is a two-party protocol that enables a sender that knows y = f(x) to transfer a random hash z = h(y) to a receiver. The receiver is guaranteed that the sender is committed to y (in the sense that it cannot come up with x and x′ such that f(x) ≠ f(x′) but h(f(x)) = h(f(x′)) = z). The sender is guaranteed that the receiver does not learn any additional information on y. In particular, when h is a two-to-one hash function, the receiver does not learn which of the two preimages {y, y′} = h^{-1}(z) is the one the sender can invert with respect to f. This paper reexamines the notion of interactive hashing. We give an alternative proof for the Naor et al. protocol, which seems to us significantly simpler and more intuitive than the original one. Moreover, the new proof achieves much better parameters (in terms of how security
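A drastically simplified simulation of the linear-query structure underlying the Naor et al. protocol (all parameters here are illustrative, and the queries are generated in one batch; the actual protocol sends one query per round and restricts their form):

```python
# Toy simulation of interactive hashing over GF(2): the receiver sends
# random linear queries q_1,...,q_{n-1}; the sender answers b_i = <q_i, y>.
# The answers define a linear hash whose preimage set contains the
# sender's string y -- typically exactly two strings {y, y'}, and the
# receiver cannot tell which of them the sender can invert under f.
import random

def inner(q, y):
    # Inner product over GF(2).
    return sum(a & b for a, b in zip(q, y)) % 2

def interactive_hash(y, n, rng):
    queries = [[rng.randrange(2) for _ in range(n)] for _ in range(n - 1)]
    answers = [inner(q, y) for q in queries]
    # Receiver's view: every n-bit string consistent with the answers.
    candidates = [tuple((v >> i) & 1 for i in range(n)) for v in range(2**n)]
    return [c for c in candidates
            if all(inner(q, c) == a for q, a in zip(queries, answers))]

rng = random.Random(7)
n = 8
y = [rng.randrange(2) for _ in range(n)]
pre = interactive_hash(y, n, rng)
# The system of n-1 linear equations in n unknowns always leaves at
# least two consistent strings, one of which is the sender's y.
assert tuple(y) in pre and len(pre) >= 2
```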
Concurrent Non-Malleable Zero Knowledge
 In Proceedings of the 47th Annual IEEE Symposium on Foundations of Computer Science
, 2006
Abstract

Cited by 12 (0 self)
We provide the first construction of a concurrent and non-malleable zero-knowledge argument for every language in NP. We stress that our construction is in the plain model with no common random string, trusted parties, or superpolynomial simulation. That is, we construct a zero-knowledge protocol Π such that for every polynomial-time adversary that can adaptively and concurrently schedule polynomially many executions of Π, and corrupt some of the verifiers and some of the provers in these sessions, there is a polynomial-time simulator that can simulate a transcript of the entire execution, along with the witnesses for all statements proven by a corrupt prover to an honest verifier. Our security model is the traditional model for concurrent zero knowledge, where the statements to be proven by the honest provers are fixed in advance and do not depend on the previous history (but can be correlated with each other); corrupted provers, of course, can choose the statements adaptively. We also prove that there exists some functionality F (a combination of zero knowledge and oblivious transfer) such that it is impossible to obtain a concurrent non-malleable protocol for F in this model. Previous impossibility results for composable protocols ruled out the existence of protocols for a wider class of functionalities (including zero knowledge!), but only if these protocols were required to remain secure when executed concurrently with arbitrarily chosen different protocols (Lindell, FOCS 2003), or if these protocols were required to remain secure when the honest parties’ inputs in each execution are chosen adaptively based on the results of previous executions.