Results 1–10 of 171
On the (im)possibility of obfuscating programs
 Lecture Notes in Computer Science
, 2001
Cited by 348 (24 self)
Informally, an obfuscator O is an (efficient, probabilistic) “compiler” that takes as input a program (or circuit) P and produces a new program O(P) that has the same functionality as P yet is “unintelligible” in some sense. Obfuscators, if they exist, would have a wide variety of cryptographic and complexity-theoretic applications, ranging from software protection to homomorphic encryption to complexity-theoretic analogues of Rice’s theorem. Most of these applications are based on an interpretation of the “unintelligibility” condition in obfuscation as meaning that O(P) is a “virtual black box,” in the sense that anything one can efficiently compute given O(P), one could also efficiently compute given oracle access to P. In this work, we initiate a theoretical investigation of obfuscation. Our main result is that, even under very weak formalizations of the above intuition, obfuscation is impossible. We prove this by constructing a family of efficient programs P that are unobfuscatable in the sense that (a) given any efficient program P′ that computes the same function as a program P ∈ P, the “source code” P can be efficiently reconstructed, yet (b) given oracle access to a (randomly selected) program P ∈ P, no efficient algorithm can reconstruct P (or even distinguish a certain bit in the code from random) except with negligible probability. We extend our impossibility result in a number of ways, including even obfuscators that (a) are not necessarily computable in polynomial time, (b) only approximately preserve the functionality, and (c) only need to work for very restricted models of computation (TC⁰). We also rule out several potential applications of obfuscators, by constructing “unobfuscatable” signature schemes, encryption schemes, and pseudorandom function families.
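The impossibility proof hinges on pairs of programs whose source code reveals a secret that oracle access hides. As a hedged toy sketch of the point-function pair underlying the construction (Python closures stand in for circuits; the names `make_pair`, `alpha`, and `beta` are illustrative, and `D` here receives a callable rather than actual source code):

```python
import secrets

def make_pair(n=16):
    """Sample a random point-function pair (C, D) keyed by secrets alpha, beta."""
    alpha = secrets.randbits(n)
    beta = secrets.randbits(n)

    def C(x):
        # Point function: returns beta only on the hidden input alpha.
        return beta if x == alpha else 0

    def D(prog):
        # D outputs 1 iff the given program agrees with C on the secret point.
        return 1 if prog(alpha) == beta else 0

    return C, D

C, D = make_pair()
# Holding (an implementation of) C, anyone can make D accept:
assert D(C) == 1
# A program that does not embed the secret point is rejected:
assert D(lambda x: 0) == 0
```

With the code of C in hand an adversary can feed it to D and extract the hidden relationship, while an oracle-only adversary would have to guess the n-bit point alpha, which succeeds only with negligible probability.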
Searchable encryption revisited: Consistency properties, relation to anonymous IBE, and extensions
, 2005
Cited by 143 (3 self)
We identify and fill some gaps with regard to consistency (the extent to which false positives are produced) for public-key encryption with keyword search (PEKS). We define computational and statistical relaxations of the existing notion of perfect consistency, show that the scheme of [7] is computationally consistent, and provide a new scheme that is statistically consistent. We also provide a transform of an anonymous IBE scheme to a secure PEKS scheme that, unlike the previous one, guarantees consistency. Finally, we suggest three extensions of the basic notions considered here, namely anonymous HIBE, public-key encryption with temporary keyword search, and identity-based encryption.
On private scalar product computation for privacy-preserving data mining
 In Proceedings of the 7th Annual International Conference in Information Security and Cryptology
, 2004
Cited by 77 (4 self)
In mining and integrating data from multiple sources, there are many privacy and security issues. In several different contexts, the security of the full privacy-preserving data mining protocol depends on the security of the underlying private scalar product protocol. We show that two of the private scalar product protocols, one of which was proposed in a leading data mining conference, are insecure. We then describe a provably private scalar product protocol that is based on homomorphic encryption and improve its efficiency so that it can also be used on massive datasets. Keywords: privacy-preserving data mining, private scalar product protocol, vertically partitioned frequent pattern mining.
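The provably private protocol described above rests on additively homomorphic encryption. A minimal sketch of that idea, using textbook Paillier with demo-sized primes as a stand-in (real deployments need moduli of 2048 bits or more, plus the paper's efficiency optimizations for massive datasets):

```python
import math
import secrets

# Tiny Paillier keypair (demo-sized primes; for illustration only).
p, q = 1000003, 1000033
n, n2 = p * q, (p * q) ** 2
g = n + 1
lam = math.lcm(p - 1, q - 1)
mu = pow((pow(g, lam, n2) - 1) // n, -1, n)

def enc(m):
    """Paillier encryption: c = g^m * r^n mod n^2, r random and coprime to n."""
    while True:
        r = secrets.randbelow(n - 1) + 1
        if math.gcd(r, n) == 1:
            return (pow(g, m, n2) * pow(r, n, n2)) % n2

def dec(c):
    """Paillier decryption via L(c^lam mod n^2) * mu mod n."""
    return ((pow(c, lam, n2) - 1) // n) * mu % n

# Alice encrypts her vector; Bob computes the encrypted dot product using
# only homomorphic operations (ciphertext multiply = plaintext add,
# ciphertext exponentiation = plaintext scaling). He never sees x.
x = [3, 1, 4, 1, 5]   # Alice's private vector
y = [2, 7, 1, 8, 2]   # Bob's private vector
cts = [enc(xi) for xi in x]
acc = 1
for c, yi in zip(cts, y):
    acc = (acc * pow(c, yi, n2)) % n2
assert dec(acc) == sum(a * b for a, b in zip(x, y))
```

Bob returns the single ciphertext `acc` (optionally blinded), and only the key holder can recover the scalar product, which is the core privacy guarantee the protocol builds on.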
Privacy-preserving classification of customer data without loss of accuracy
 Proceedings of the 5th SIAM International Conference on Data Mining
, 2005
Cited by 57 (4 self)
Privacy has become an increasingly important issue in data mining. In this paper, we consider a scenario in which a data miner surveys a large number of customers to learn classification rules on their data, while the sensitive attributes of these customers need to be protected. Solutions have been proposed to address this problem using randomization techniques. Such solutions exhibit a tradeoff of accuracy and privacy: the more each customer’s private information is protected, the less accurate the result the miner obtains; conversely, the more accurate the result, the less privacy for the customers. In this paper, we propose a simple cryptographic approach that is efficient even in a many-customer setting, provides strong privacy for each customer, and does not sacrifice any accuracy as the cost of privacy. Our key technical contribution is a privacy-preserving method that allows a data miner to compute frequencies of values or tuples of values in the customers’ data, without revealing the privacy-sensitive part of the data. Unlike general-purpose cryptographic protocols, this method requires no interaction between customers, and each customer only needs to send a single flow of communication to the data miner. However, we are still able to ensure that nothing about the sensitive data beyond the desired frequencies is revealed to the data miner. To illustrate the power of our approach, we use our frequency mining computation to obtain a privacy-preserving naive Bayes classifier learning algorithm. Initial experimental results demonstrate the practical efficiency of our solution. We also suggest some other applications of privacy-preserving frequency mining.
Append-only signatures
 in International Colloquium on Automata, Languages and Programming
, 2005
Cited by 53 (10 self)
The strongest standard security notion for digital signature schemes is unforgeability under chosen message attacks. In practice, however, this notion can be insufficient due to “side-channel attacks” which exploit leakage of information about the secret internal state. In this work we put forward the notion of “leakage-resilient signatures,” which strengthens the standard security notion by giving the adversary the additional power to learn a bounded amount of arbitrary information about the secret state that was accessed during every signature generation. This notion naturally implies security against all side-channel attacks as long as the amount of information leaked on each invocation is bounded and “only computation leaks information.” The main result of this paper is a construction which gives a (tree-based, stateful) leakage-resilient signature scheme based on any 3-time signature scheme. The amount of information that our scheme can safely leak per signature generation is 1/3 of the information the underlying 3-time signature scheme can leak in total. Signature schemes that remain secure even if a bounded total amount of information is leaked were recently constructed, hence instantiating our construction with these schemes gives the first constructions of provably secure leakage-resilient signature schemes. The above construction assumes that the signing algorithm can sample truly random bits, and thus an implementation would need some special hardware (randomness gates). Simply generating this randomness using a leakage-resilient stream cipher will in general not work. Our second contribution is a sound general principle to replace uniform random bits in any leakage-resilient construction with pseudorandom ones: run two leakage-resilient stream ciphers (with independent keys) in parallel and then apply a two-source extractor to their outputs.
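The second contribution (two independently keyed stream ciphers feeding a two-source extractor) can be sketched as follows. SHA-256 in counter mode is a hypothetical stand-in for a leakage-resilient stream cipher, and the one-bit inner-product extractor over GF(2) is the classical two-source extractor; both choices here are illustrative, not the paper's concrete instantiation:

```python
import hashlib

def stream_block(key: bytes, counter: int, nbytes: int = 16) -> bytes:
    # Hash-counter stand-in for one output block of a leakage-resilient
    # stream cipher (illustrative only).
    return hashlib.sha256(key + counter.to_bytes(8, "big")).digest()[:nbytes]

def inner_product_bit(x: bytes, y: bytes) -> int:
    # Two-source extractor, one bit: inner product of the blocks over GF(2).
    v = int.from_bytes(x, "big") & int.from_bytes(y, "big")
    return bin(v).count("1") % 2

def extract_bits(key1: bytes, key2: bytes, nbits: int) -> list:
    # Run the two independently keyed streams in parallel and extract
    # one pseudorandom bit per pair of blocks.
    return [inner_product_bit(stream_block(key1, i), stream_block(key2, i))
            for i in range(nbits)]

bits = extract_bits(b"key-one", b"key-two", 8)
assert len(bits) == 8 and all(b in (0, 1) for b in bits)
```

The point of the construction is that even if each cipher leaks partial information about its own state, the extractor output remains close to uniform as long as the two streams retain enough independent entropy.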
Cryptography in NC0
, 2006
Cited by 48 (13 self)
We study the parallel time-complexity of basic cryptographic primitives such as one-way functions (OWFs) and pseudorandom generators (PRGs). Specifically, we study the possibility of implementing instances of these primitives by NC0 functions, namely by functions in which each output bit depends on a constant number of input bits. Despite previous efforts in this direction, there has been no convincing theoretical evidence supporting this possibility, which was posed as an open question in several previous works. We essentially settle this question by providing strong positive evidence for the possibility of cryptography in NC0. Our main result is that every “moderately easy” OWF (resp., PRG), say computable in NC1, can be compiled into a corresponding OWF (resp., “low-stretch” PRG) in which each output bit depends on at most 4 input bits. The existence of OWF and PRG in NC1 is a relatively mild assumption, implied by most number-theoretic or algebraic intractability assumptions commonly used in cryptography. A similar compiler can also be obtained for other cryptographic primitives such as one-way permutations, encryption, signatures, commitment, and collision-resistant hashing. Our techniques can also be applied to obtain (unconditional) constructions of “non-cryptographic” PRGs. In particular, we obtain ε-biased generators and a PRG for space-bounded computation in which each output bit depends on only 3 input bits. Our results make use of the machinery of randomizing polynomials (Ishai and Kushilevitz, 41st FOCS, 2000), which was originally motivated by questions in the domain of information-theoretic secure multiparty computation.
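To make "each output bit depends on at most 4 input bits" concrete, here is a hedged toy of a locality-4 function in the style of Goldreich-type local candidates (random tap positions, illustrative names; this is not the paper's randomizing-polynomials compiler and not a proven OWF):

```python
import random

def local4_function(x, taps):
    """Each output bit reads exactly 4 input bits:
    XOR of two bits plus an AND of two more (toy locality-4 predicate)."""
    return [x[i] ^ x[j] ^ (x[k] & x[l]) for (i, j, k, l) in taps]

random.seed(0)
n, m = 32, 40                              # input length, output length
x = [random.getrandbits(1) for _ in range(n)]
taps = [tuple(random.sample(range(n), 4))  # 4 distinct input positions
        for _ in range(m)]                 # per output bit
y = local4_function(x, taps)
assert len(y) == m and all(b in (0, 1) for b in y)
```

Every output bit is computed by a constant-size gadget over 4 input wires, which is exactly the kind of NC0 structure the paper's compiler produces from an NC1 primitive.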
Password-based group key exchange in a constant number of rounds
 In Public Key Cryptography (PKC)
, 2006
Cited by 45 (5 self)
With the development of grids, distributed applications are spread across multiple computing resources and require efficient security mechanisms among the processes. Although protocols for authenticated group Diffie-Hellman key exchange seem to be the natural mechanisms for supporting these applications, current solutions are either limited by the use of public-key infrastructures or by their scalability, requiring a number of rounds linear in the number of group members. To overcome these shortcomings, we propose in this paper the first provably secure password-based constant-round group key exchange protocol. It is based on the protocol of Burmester and Desmedt and is provably secure in the random-oracle and ideal-cipher models, under the Decisional Diffie-Hellman assumption. The new protocol is very efficient and fully scalable since it only requires four rounds of communication and four multi-exponentiations per user. Moreover, the new protocol avoids intricate authentication infrastructures by relying on passwords for authentication.
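The underlying Burmester-Desmedt round structure can be sketched as follows (toy parameters; the paper's password-authentication layer and its security proof are omitted):

```python
import random

# Toy Burmester-Desmedt group key exchange (demo parameters only).
p = 2**127 - 1          # prime modulus (Mersenne prime; demo-sized)
g = 3                   # generator for the demo group
random.seed(1)
n = 5                   # number of group members
r = [random.randrange(2, p - 1) for _ in range(n)]  # private exponents

# Round 1: each user i broadcasts z_i = g^{r_i}.
z = [pow(g, ri, p) for ri in r]

# Round 2: each user i broadcasts X_i = (z_{i+1} / z_{i-1})^{r_i}.
X = [pow(z[(i + 1) % n] * pow(z[(i - 1) % n], -1, p) % p, r[i], p)
     for i in range(n)]

def group_key(i):
    # K_i = z_{i-1}^{n r_i} * X_i^{n-1} * X_{i+1}^{n-2} * ... * X_{i+n-2}
    k = pow(z[(i - 1) % n], n * r[i], p)
    for t in range(n - 1):
        k = k * pow(X[(i + t) % n], n - 1 - t, p) % p
    return k

keys = [group_key(i) for i in range(n)]
assert len(set(keys)) == 1  # every member derives the same group key
```

The telescoping products ensure every member computes K = g^{r_1 r_2 + r_2 r_3 + ... + r_n r_1} in a constant number of rounds, which is the scalability property the paper builds its password-based protocol on.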
Lower bounds for non-black-box zero knowledge
 In 44th FOCS
, 2003
Cited by 42 (7 self)
We show new lower bounds and impossibility results for general (possibly non-black-box) zero-knowledge proofs and arguments. Our main results are that, under reasonable complexity assumptions:
1. There does not exist a two-round zero-knowledge proof system with perfect completeness for an NP-complete language. The previous impossibility result for two-round zero knowledge, by Goldreich and Oren (J. Cryptology, 1994), was only for the case of auxiliary-input zero-knowledge proofs and arguments.
2. There does not exist a constant-round zero-knowledge strong proof or argument of knowledge (as defined by Goldreich (2001)) for a nontrivial language.
3. There does not exist a constant-round public-coin proof system for a nontrivial language that is resettable zero knowledge. This result also extends to bounded-resettable zero knowledge, in which the number of resets is a priori bounded by a polynomial in the input length and prover-to-verifier communication.
Private Circuits II: Keeping Secrets In Tamperable Circuits
, 2006
Cited by 39 (4 self)
Motivated by the problem of protecting cryptographic hardware, we continue the investigation of private circuits initiated in [16]. In this work, our aim is to construct circuits that should protect the secrecy of their internal state against an adversary who may modify the values of an unbounded number of wires, anywhere in the circuit. In contrast, all previous works on protecting cryptographic hardware relied on an assumption that some portion of the circuit must remain completely free from tampering.