Results 1–10 of 34
Concurrent Zero-Knowledge
 In 30th STOC
, 1998
"... Concurrent executions of a zeroknowledge protocol by a single prover (with one or more verifiers) may leak information and may not be zeroknowledge in toto. In this paper, we study the problem of maintaining zeroknowledge We introduce the notion of an (; ) timing constraint: for any two proces ..."
Abstract

Cited by 177 (18 self)
 Add to MetaCart
Concurrent executions of a zeroknowledge protocol by a single prover (with one or more verifiers) may leak information and may not be zeroknowledge in toto. In this paper, we study the problem of maintaining zeroknowledge We introduce the notion of an (; ) timing constraint: for any two processors P1 and P2 , if P1 measures elapsed time on its local clock and P2 measures elapsed time on its local clock, and P2 starts after P1 does, then P2 will finish after P1 does. We show that if the adversary is constrained by an (; ) assumption then there exist fourround almost concurrent zeroknowledge interactive proofs and perfect concurrent zeroknowledge arguments for every language in NP . We also address the more specific problem of Deniable Authentication, for which we propose several particularly efficient solutions. Deniable Authentication is of independent interest, even in the sequential case; our concurrent solutions yield sequential solutions without recourse to timing, i.e., in the standard model.
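The (α, β) timing constraint in the abstract above can be sketched numerically. The following is a hypothetical toy model (the rate bound `r_min`/`r_max` and the sufficient choice β ≥ α · r_max/r_min are assumptions for illustration, not taken from the paper): if both local clock rates lie in a known interval and β is chosen large enough, a processor that starts later and waits β local time units always finishes after one that waits α local time units.

```python
# Toy numeric sketch (assumed parameters) of an (alpha, beta) timing
# constraint: P1 waits alpha units of *local* time, P2 waits beta units of
# *local* time, and P2 starts no earlier than P1 in real time.  If clock
# rates lie in [r_min, r_max] and beta >= alpha * (r_max / r_min), then
# P2 is guaranteed to finish after P1.
import random

def finishes_after(alpha, beta, s1, s2, r1, r2):
    """Compare real finish times; r1, r2 are local ticks per real unit."""
    f1 = s1 + alpha / r1   # real time at which P1 finishes
    f2 = s2 + beta / r2    # real time at which P2 finishes
    return f2 >= f1

r_min, r_max = 1.0, 2.0
alpha = 10.0
beta = alpha * (r_max / r_min)   # sufficient beta under the rate bound

random.seed(0)
for _ in range(1000):
    s1 = random.uniform(0, 5)
    s2 = s1 + random.uniform(0, 5)        # P2 starts after P1
    r1 = random.uniform(r_min, r_max)
    r2 = random.uniform(r_min, r_max)
    assert finishes_after(alpha, beta, s1, s2, r1, r2)
print("constraint satisfied in all sampled schedules")
```

In the worst case (P2's clock fastest, P1's slowest, simultaneous start), the two real finish times coincide, which is why β = α · r_max/r_min suffices in this sketch.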
On the Existence of 3-Round Zero-Knowledge Protocols
 In CRYPTO '98, Springer LNCS 1462
, 1998
"... In this paper, we construct a 3round zeroknowledge protocol for any NP language. Our protocol achieves weaker notions of zeroknowledge than blackbox simulation zeroknowledge. Therefore, our result does not contradict the triviality result of Goldreich and Krawczyk [GoKr96] which shows that 3ro ..."
Abstract

Cited by 66 (2 self)
 Add to MetaCart
(Show Context)
In this paper, we construct a 3round zeroknowledge protocol for any NP language. Our protocol achieves weaker notions of zeroknowledge than blackbox simulation zeroknowledge. Therefore, our result does not contradict the triviality result of Goldreich and Krawczyk [GoKr96] which shows that 3round blackbox simulation zeroknowledge exist only for BPP languages. Our main contribution is to provide a nonblackbox simulation technique. Whether there exists such a simulation technique was a major open problem in the theory of zeroknowledge. Our simulation technique is based on a nonstandard computational assumption related to the Di#eHellman problem, which was originally proposed by Damgard [Da91]. This assumption, which we call the DA1, says that, given randomly chosen instance of the discrete logarithm problem (p, q, g, g a ), it is infeasible to compute (B, X) such that X = B a mod p without knowing the value b satisfying B = g b mod p. Our protocol achieves di#erent no...
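The DA1 relation stated in the abstract can be illustrated with modular arithmetic on toy numbers (the parameters p, q, g, a, b below are made-up small values, not secure ones): anyone who knows an exponent b can produce a valid pair (B, X) from the public value g^a alone, since X = B^a = (g^a)^b mod p. The assumption asserts that this is essentially the only feasible way.

```python
# Small-number illustration (insecure toy parameters) of the DA1 relation:
# given (p, q, g, g^a), knowing b with B = g^b mod p lets one compute
# X = (g^a)^b mod p, and this pair satisfies X = B^a mod p.

p = 23                      # small prime; q = 11 divides p - 1
g = 4                       # generator of the order-11 subgroup mod 23
a = 7                       # secret exponent of the instance
ga = pow(g, a, p)           # public value g^a mod p

b = 5                       # exponent known to the party computing the pair
B = pow(g, b, p)            # B = g^b mod p
X = pow(ga, b, p)           # computed from the *public* g^a and known b

assert X == pow(B, a, p)    # (B, X) satisfies the DA1 relation X = B^a mod p
print(B, X)                 # → 12 16
```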
Efficient Zero-Knowledge Proofs of Knowledge Without Intractability Assumptions
, 2000
"... We initiate the investigation of the class of relations that admit extremely efficient perfect zero knowledge proofs of knowledge: constant number of rounds, communication linear in the length of the statement and the witness, and negligible knowledge error. In its most general incarnation, our ..."
Abstract

Cited by 41 (0 self)
 Add to MetaCart
We initiate the investigation of the class of relations that admit extremely efficient perfect zero knowledge proofs of knowledge: constant number of rounds, communication linear in the length of the statement and the witness, and negligible knowledge error. In its most general incarnation, our result says that for relations that have a particular threemove honestverifier zeroknowledge (HVZK) proof of knowledge, and which admit a particular threemove HVZK proof of knowledge for an associated commitment relation, perfect zero knowledge (against a general verifier) can be achieved essentially for free, even when proving statements on several instances combined under under monotone function composition. In addition, perfect zeroknowledge is achieved with an optimal 4moves. Instantiations of our main protocol lead to efficient perfect ZK proofs of knowledge of discrete logarithms and RSAroots, or more generally, qoneway group homomorphisms. None of our results rely...
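A classic example of the three-move HVZK proofs of knowledge that the abstract above builds on is Schnorr's protocol for discrete logarithms. The sketch below uses toy parameters (the values of p, q, g, and the witness are assumptions for illustration, not a secure instantiation):

```python
# Three-move HVZK proof of knowledge of a discrete log (Schnorr's protocol),
# with insecure toy parameters.  Prover knows w with x = g^w mod p.
import random
random.seed(1)

p, q, g = 23, 11, 4          # g generates the order-q subgroup mod p
w = 6                        # prover's witness
x = pow(g, w, p)             # public statement

# Move 1 (commit): prover picks random r, sends t = g^r mod p.
r = random.randrange(q)
t = pow(g, r, p)
# Move 2 (challenge): verifier sends random c.
c = random.randrange(q)
# Move 3 (response): prover sends s = r + c*w mod q.
s = (r + c * w) % q
# Verifier accepts iff g^s == t * x^c (mod p).
assert pow(g, s, p) == (t * pow(x, c, p)) % p
print("accepted")
```

The check passes because g^s = g^(r + cw) = g^r · (g^w)^c = t · x^c mod p; this is the shape of protocol (commit, challenge, response) the paper takes as its starting point.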
Does Parallel Repetition Lower the Error in Computationally Sound Protocols?
 In Proceedings of the 38th Annual Symposium on Foundations of Computer Science, IEEE
, 1997
"... Whether or not parallel repetition lowers the error has been a fundamental question in the theory of protocols, with applications in many di erent areas. It is well known that parallel repetition reduces the error at an exponential rate in interactive proofs and ArthurMerlin games. It seems to have ..."
Abstract

Cited by 41 (7 self)
 Add to MetaCart
Whether or not parallel repetition lowers the error has been a fundamental question in the theory of protocols, with applications in many di erent areas. It is well known that parallel repetition reduces the error at an exponential rate in interactive proofs and ArthurMerlin games. It seems to have been taken for granted that the same is true in arguments, or other proofs where the soundness only holds with respect to computationally bounded parties. We show that this is not the case. Surprisingly, parallel repetition can actually fail in this setting. We present fourround protocols whose error does not decrease under parallel repetition. This holds for any (polynomial) number of repetitions. These protocols exploit nonmalleable encryption and can be based on any trapdoor permutation. On the other hand we show that for threeround protocols the error does go down exponentially fast. The question of parallel error reduction is particularly important when the protocol is used in cryptographic settings like identi cation, and the error represent the probability that an intruder succeeds.
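The "exponential rate" in the statistically sound case amounts to simple arithmetic: a prover who wins each of k independent parallel copies with probability ε wins all of them with probability ε^k. The numbers below are an illustrative assumption, not taken from the paper; the paper's point is precisely that this independence argument can break down for computationally sound four-round protocols.

```python
# Error under k-fold parallel repetition when per-copy success events are
# independent (as for unconditionally sound interactive proofs): eps ** k.
eps = 0.5                    # assumed soundness error of a single execution
for k in (1, 2, 10, 40):
    print(k, eps ** k)       # decreases exponentially in k

assert eps ** 10 == 2 ** -10
```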
Concurrent Zero-Knowledge With Timing, Revisited
, 2002
"... Following Dwork, Naor, and Sahai (30th STOC, 1998), we consider concurrent execution of protocols in a semisynchronized network. Specifically, we assume that each party holds a local clock such that a constant bound on the relative rates of these clocks is apriori known, and consider protocols tha ..."
Abstract

Cited by 34 (0 self)
 Add to MetaCart
Following Dwork, Naor, and Sahai (30th STOC, 1998), we consider concurrent execution of protocols in a semisynchronized network. Specifically, we assume that each party holds a local clock such that a constant bound on the relative rates of these clocks is apriori known, and consider protocols that employ timedriven operations (i.e., timeout incoming messages and delay outgoing messages). We show that the constantround zeroknowledge proof for N P of Goldreich and Kahan (Jour. of Crypto., 1996) preserves its security when polynomiallymany independent copies are executed concurrently under the above timing model. We stress that our main result establishes zeroknowledge of interactive proofs, whereas the results of Dwork et. al. are either for zeroknowledge arguments or for a weak notion of zeroknowledge (called fflknowledge) proofs.
A Note on Negligible Functions
 Journal of Cryptology
, 2002
"... The notion of a negligible function is used in theoretical cryptography to formalize the notion of a function asymptotically "too small to matter." We claim the issue that really arises is what it might mean for a sequence of functions to be "negligible." We consider (and define) ..."
Abstract

Cited by 29 (2 self)
 Add to MetaCart
The notion of a negligible function is used in theoretical cryptography to formalize the notion of a function asymptotically "too small to matter." We claim the issue that really arises is what it might mean for a sequence of functions to be "negligible." We consider (and define) two such notions, and prove them equivalent. Roughly, this enables us to say that any cryptographic primitive has a specific associated "security level." In particular we can say this for any oneway function. We can also reconcile different definitions of negligible error arguments and computational proofs of knowledge that have appeared in the literature. Although there are some cryptographic consequences, the main result is something purely about negligible functions. EMail: mihir@cs.ucsd.edu. URL: http://wwwcse.ucsd.edu/users/mihir. Supported in part by NSF CAREER Award CCR9624439 and a Packard Foundation Fellowship in Science and Engineering. Contents 1 Introduction 3 1.1 The issue for oneway fu...
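The standard definition behind the abstract above says f is negligible if for every constant c there is an N such that f(n) < n^(-c) for all n ≥ N. The finite-range check below is only an illustrative sketch (the cutoff `n_max = 200` and the sample functions are assumptions, and a bounded scan cannot prove an asymptotic statement): it finds such an N for f(n) = 2^(-n) against a few inverse polynomials, while 1/n^2 fails the same test for c = 3.

```python
# Finite-range sketch of the negligibility test: f is negligible if for
# every c there is an N with f(n) < n**-c for all n >= N.  A bounded scan
# only illustrates the definition; it is not a proof of the asymptotics.

def eventually_below(f, c, n_max=200):
    """Smallest n0 < n_max with f(n) < n**-c for all n in [n0, n_max), else None."""
    for n0 in range(2, n_max):
        if all(f(n) < n ** -c for n in range(n0, n_max)):
            return n0
    return None

f = lambda n: 2.0 ** -n          # negligible
for c in (1, 2, 5):
    print(c, eventually_below(f, c))

g = lambda n: 1.0 / n ** 2       # inverse polynomial: not negligible
assert eventually_below(g, 3) is None   # fails already for c = 3
```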
Signcryption and its applications in efficient public key solutions
 In Proceedings of ISW ’97, volume 1396 of LNCS
, 1997
"... Abstract. Signcryption is a new paradigm in public key cryptography that simultaneously fulfills both the functions of digital signature and public key encryption in a logically single step, and with a cost significantly lower than that required by the traditional “signature followed by encryption ” ..."
Abstract

Cited by 25 (3 self)
 Add to MetaCart
(Show Context)
Abstract. Signcryption is a new paradigm in public key cryptography that simultaneously fulfills both the functions of digital signature and public key encryption in a logically single step, and with a cost significantly lower than that required by the traditional “signature followed by encryption ” approach. This paper summarizes currently known construction methods for signcryption, carries out a comprehensive comparison between signcryption and “signature followed by encryption”, and suggests a number of applications of signcryption in the search of efficient security solutions based on public key cryptography.
A Technical Overview of Digital Credentials
, 2002
"... Applications that involve the electronic transfer of credentials, value tokens, profiles, and other sensitive information are quickly gaining momentum. Traditional attempts to introduce electronic authentication, such as PKI and biometric verification, expose organizations to potentially unlimite ..."
Abstract

Cited by 21 (0 self)
 Add to MetaCart
Applications that involve the electronic transfer of credentials, value tokens, profiles, and other sensitive information are quickly gaining momentum. Traditional attempts to introduce electronic authentication, such as PKI and biometric verification, expose organizations to potentially unlimited liability, lead to consumer fear, and stifle the adoption of new systems. To overcome these barriers, innovative solutions are needed that address the entire spectrum of security and privacy interests for all parties involved.
One-Way Permutations, Interactive Hashing and Statistically-Hiding Commitments
 In S. Vadhan (Ed.): Theory of Cryptography (TCC) 2007, LNCS 4392
, 2007
"... Abstract. We present a lower bound on the round complexity of a natural class of blackbox constructions of statistically hiding commitments from oneway permutations. This implies a Ω ( n logn) lower bound on the round complexity of a computational form of interactive hashing, which has been used t ..."
Abstract

Cited by 16 (1 self)
 Add to MetaCart
Abstract. We present a lower bound on the round complexity of a natural class of blackbox constructions of statistically hiding commitments from oneway permutations. This implies a Ω ( n logn) lower bound on the round complexity of a computational form of interactive hashing, which has been used to construct statistically hiding commitments (and related primitives) from various classes of oneway functions, starting with the work of Naor, Ostrovsky, Venkatesan and Yung (J. Cryptology, 1998). Our lower bound matches the round complexity of the protocol studied by Naor et al.
Shortened digital signature, signcryption and compact and unforgeable key agreement schemes
 IEEE P1363 Standard for Public Key Cryptography: Additional Techniques
"... ..."