Results 1–10 of 105
On Lattices, Learning with Errors, Random Linear Codes, and Cryptography
In STOC, 2005
Cited by 364 (6 self)
Abstract
Our main result is a reduction from worst-case lattice problems such as SVP and SIVP to a certain learning problem. This learning problem is a natural extension of the ‘learning from parity with error’ problem to higher moduli. It can also be viewed as the problem of decoding from a random linear code. This, we believe, gives a strong indication that these problems are hard. Our reduction, however, is quantum. Hence, an efficient solution to the learning problem implies a quantum algorithm for SVP and SIVP. A main open question is whether this reduction can be made classical. We also present a (classical) public-key cryptosystem whose security is based on the hardness of the learning problem. By the main result, its security is also based on the worst-case quantum hardness of SVP and SIVP. Previous lattice-based public-key cryptosystems such as the one by Ajtai and Dwork were based only on unique-SVP, a special case of SVP. The new cryptosystem is much more efficient than previous cryptosystems: the public key is of size Õ(n²) and encrypting a message increases its size by a factor of Õ(n) (in previous cryptosystems these values are Õ(n⁴) and Õ(n²), respectively). In fact, under the assumption that all parties share a random bit string of length Õ(n²), the size of the public key can be reduced to Õ(n).
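The learning problem described above amounts to solving noisy modular linear equations. A minimal toy sketch of an LWE-style sample oracle (the parameters n = 8, q = 97 and the crude bounded error are illustrative choices, not the paper's Gaussian error distribution):

```python
# Toy LWE-style sample: the higher-modulus analogue of "learning parity
# with error". All parameter choices here are illustrative.
import random

def lwe_sample(s, q, error_width=2):
    """Return one noisy equation (a, b) with b = <a, s> + e (mod q)."""
    n = len(s)
    a = [random.randrange(q) for _ in range(n)]
    e = random.randint(-error_width, error_width)  # small noise term
    b = (sum(ai * si for ai, si in zip(a, s)) + e) % q
    return a, b

random.seed(0)
q = 97
s = [random.randrange(q) for _ in range(8)]  # the secret
a, b = lwe_sample(s, q)

# Knowing s, each sample is consistent up to a small residue e (mod q):
residue = (b - sum(ai * si for ai, si in zip(a, s))) % q
```

Without `s`, the learner sees only pairs `(a, b)`; the paper's result is that recovering `s` (or even distinguishing such pairs from uniform) is as hard as worst-case lattice problems.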
PayWord and MicroMint: two simple micropayment schemes
In CryptoBytes, 1996
Cited by 263 (4 self)
Abstract
We present two simple micropayment schemes, "PayWord" and "MicroMint," for making small purchases over the Internet. We were inspired to work on this problem by DEC's "Millicent" scheme [10]. Surveys of some electronic payment schemes can be found in Hallam-Baker [6], Schneier [16], and Wayner [18]. Our main goal is to minimize the number of public-key operations required per payment, using hash operations instead whenever possible. As a rough guide, hash functions are about 100 times faster than RSA signature verification, and about 10,000 times faster than RSA signature generation: on a typical workstation, one can sign two messages per second, verify 200 signatures per second, and compute 20,000 hash function values per second.
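The hash-heavy design described above can be illustrated with a toy PayWord-style hash chain (SHA-256 stands in for the paper's hash function, and the user's one-time signed commitment to the chain anchor is omitted):

```python
# Toy PayWord-style chain: the user picks w_n at random, iterates the hash
# to get w0 = H^n(w_n), and commits to w0 once. Payment i reveals w_i;
# the vendor verifies it with a single hash against the previous payword.
import hashlib

def H(x: bytes) -> bytes:
    return hashlib.sha256(x).digest()

def make_chain(seed: bytes, n: int):
    """Return [w0, w1, ..., wn] with w_i = H(w_{i+1})."""
    chain = [seed]
    for _ in range(n):
        chain.append(H(chain[-1]))
    chain.reverse()            # chain[0] = w0 = H^n(seed)
    return chain

chain = make_chain(b"user secret seed", 100)
w0 = chain[0]                  # anchor: signed once by the user (not shown)

# Vendor's per-payment check is one hash, no public-key operation:
ok = all(H(chain[i]) == chain[i - 1] for i in range(1, 101))
```

This is the source of the efficiency claim above: one signature authorizes the whole chain, after which each of the n payments costs only a hash evaluation to verify.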
Number-theoretic constructions of efficient pseudorandom functions
In 38th Annual Symposium on Foundations of Computer Science, 1997
Authenticating Pervasive Devices with Human Protocols
, 2005
Cited by 167 (5 self)
Abstract. Forgery and counterfeiting are emerging as serious security risks in low-cost pervasive computing devices. These devices lack the computational, storage, power, and communication resources necessary for most cryptographic authentication schemes. Surprisingly, low-cost pervasive devices like Radio Frequency Identification (RFID) tags share similar capabilities with another weak computing device: people. These similarities motivate the adoption of techniques from human-computer security to the pervasive computing setting. This paper analyzes a particular human-to-computer authentication protocol designed by Hopper and Blum (HB), and shows it to be practical for low-cost pervasive devices. We offer an improved, concrete proof of security for the HB protocol against passive adversaries. This paper also offers a new, augmented version of the HB protocol, named HB+, that is secure against active adversaries. The HB+ protocol is a novel, symmetric authentication protocol with a simple, low-cost implementation. We prove the security of the HB+ protocol against active adversaries based on the hardness of the Learning Parity with Noise (LPN) problem.
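A toy simulation of the basic HB interaction described above (the round count, noise rate, and acceptance threshold are illustrative choices, and HB+'s extra blinding step against active adversaries is not shown):

```python
# Toy HB rounds over GF(2): the reader sends a random challenge a, the tag
# answers <a, x> XOR noise (each bit flipped with probability eta), and the
# reader accepts if the number of disagreements stays below a threshold.
import random

def dot2(a, x):
    """Inner product over GF(2)."""
    return sum(ai & xi for ai, xi in zip(a, x)) & 1

def hb_authenticate(x, eta=0.125, rounds=200, threshold=0.25, rng=random):
    errors = 0
    for _ in range(rounds):
        a = [rng.randrange(2) for _ in range(len(x))]  # reader's challenge
        noise = 1 if rng.random() < eta else 0         # tag's deliberate noise
        z = dot2(a, x) ^ noise                         # tag's noisy reply
        if z != dot2(a, x):                            # reader knows x too
            errors += 1
    return errors <= threshold * rounds

random.seed(1)
x = [random.randrange(2) for _ in range(32)]           # shared secret
accepted = hb_authenticate(x)
```

The deliberate noise is what ties security to LPN: an eavesdropper collecting the pairs `(a, z)` faces exactly the Learning Parity with Noise problem, while the legitimate tag needs only bitwise AND/XOR, which is why the protocol suits RFID-class hardware.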
Public-key cryptosystems from the worst-case shortest vector problem
, 2008
Cited by 152 (22 self)
Abstract
We construct public-key cryptosystems that are secure assuming the worst-case hardness of approximating the length of a shortest nonzero vector in an n-dimensional lattice to within a small poly(n) factor. Prior cryptosystems with worst-case connections were based either on the shortest vector problem for a special class of lattices (Ajtai and Dwork, STOC 1997; Regev, J. ACM 2004), or on the conjectured hardness of lattice problems for quantum algorithms (Regev, STOC 2005). Our main technical innovation is a reduction from certain variants of the shortest vector problem to corresponding versions of the “learning with errors” (LWE) problem; previously, only a quantum reduction of this kind was known. In addition, we construct new cryptosystems based on the search version of LWE, including a very natural chosen-ciphertext-secure system that has a much simpler description and tighter underlying worst-case approximation factor than prior constructions.
Secure human identification protocols
In Asiacrypt, 2001
Cited by 127 (3 self)
Abstract. One interesting and important challenge for the cryptologic community is that of providing secure authentication and identification for unassisted humans. There is a range of protocols for secure identification which require various forms of trusted hardware or software, aimed at protecting privacy and financial assets. But how do we verify our identity, securely, when we don’t have or don’t trust our smart card, palmtop, or laptop? In this paper, we provide definitions of what we believe to be reasonable goals for secure human identification. We demonstrate that existing solutions do not meet these reasonable definitions. Finally, we provide solutions which demonstrate the feasibility of the security conditions attached to our definitions, but which are impractical for use by humans.
On ideal lattices and learning with errors over rings
In Proc. of EUROCRYPT, volume 6110 of LNCS, 2010
Cited by 125 (18 self)
Abstract
The “learning with errors” (LWE) problem is to distinguish random linear equations, which have been perturbed by a small amount of noise, from truly uniform ones. The problem has been shown to be as hard as worst-case lattice problems, and in recent years it has served as the foundation for a plethora of cryptographic applications. Unfortunately, these applications are rather inefficient due to an inherent quadratic overhead in the use of LWE. A main open question was whether LWE and its applications could be made truly efficient by exploiting extra algebraic structure, as was done for lattice-based hash functions (and related primitives). We resolve this question in the affirmative by introducing an algebraic variant of LWE called ring-LWE, and proving that it too enjoys very strong hardness guarantees. Specifically, we show that the ring-LWE distribution is pseudorandom, assuming that worst-case problems on ideal lattices are hard for polynomial-time quantum algorithms. Applications include the first truly practical lattice-based public-key cryptosystem with an efficient security reduction; moreover, many of the other applications of LWE can be made much more efficient through the use of ring-LWE.
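The "extra algebraic structure" mentioned above is arithmetic in a polynomial ring. A toy sketch of one ring-LWE-style sample in Z_q[x]/(x^n + 1), using schoolbook multiplication and illustrative parameters (real instantiations use discrete Gaussian errors and NTT-friendly moduli for quasi-linear multiplication):

```python
# Toy ring-LWE-style sample: one product b = a*s + e in R_q = Z_q[x]/(x^n+1)
# packs n correlated LWE-style equations into a single polynomial, which is
# the source of the efficiency gain over plain LWE. Parameters illustrative.
import random

def ring_mul(a, b, q):
    """Multiply in Z_q[x]/(x^n + 1): the wraparound x^(n+t) = -x^t."""
    n = len(a)
    res = [0] * n
    for i, ai in enumerate(a):
        for j, bj in enumerate(b):
            k = i + j
            if k < n:
                res[k] = (res[k] + ai * bj) % q
            else:                                  # x^(n+t) == -x^t
                res[k - n] = (res[k - n] - ai * bj) % q
    return res

random.seed(0)
n, q = 8, 97
s = [random.randrange(q) for _ in range(n)]        # secret polynomial
a = [random.randrange(q) for _ in range(n)]        # uniform public polynomial
e = [random.randint(-2, 2) % q for _ in range(n)]  # small error polynomial
b = [(c + ei) % q for c, ei in zip(ring_mul(a, s, q), e)]

# Knowing s, the noise is recoverable and small coefficient-wise:
diff = [(bi - ci) % q for bi, ci in zip(b, ring_mul(a, s, q))]
small = all(min(d, q - d) <= 2 for d in diff)
```

One pair `(a, b)` here plays the role of n plain-LWE samples, which is where the quadratic overhead of plain LWE is saved.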
Simultaneous hardcore bits and cryptography against memory attacks
In TCC, 2009
Cited by 116 (11 self)
Abstract
This paper considers two questions in cryptography. Cryptography Secure Against Memory Attacks. A particularly devastating side-channel attack against cryptosystems, termed the “memory attack”, was proposed recently. In this attack, a significant fraction of the bits of a secret key of a cryptographic algorithm can be measured by an adversary if the secret key is ever stored in a part of memory which can be accessed even after power has been turned off for a short amount of time. Such an attack has been shown to completely compromise the security of various cryptosystems in use, including the RSA cryptosystem and AES. We show that the public-key encryption scheme of Regev (STOC 2005), and the identity-based encryption scheme of Gentry, Peikert and Vaikuntanathan (STOC 2008) are remarkably robust against memory attacks where the adversary can measure a large fraction of the bits of the secret key, or more generally, can compute an arbitrary function of the secret key of bounded output length. This is done without increasing the size of the secret key, and without introducing any ...
On the learnability of discrete distributions
In The 25th Annual ACM Symposium on Theory of Computing, 1994
Cited by 116 (12 self)
Abstract
We introduce and investigate a new model of learning probability distributions from independent draws. Our model is inspired by the popular Probably Approximately Correct (PAC) model for learning Boolean functions from labeled ...
Learning polynomials with queries: The highly noisy case
, 1995
Cited by 97 (17 self)
Abstract
Given a function f mapping n-variate inputs from a finite field F into F, we consider the task of reconstructing a list of all n-variate degree-d polynomials which agree with f on a tiny but non-negligible fraction, δ, of the input space. We give a randomized algorithm for solving this task which accesses f as a black box and runs in time polynomial in 1/δ and n and exponential in d, provided δ is Ω(√(d/|F|)). For the special case when d = 1, we solve this problem for every δ > 0. In this case the running time of our algorithm is bounded by a polynomial in 1/δ and n and exponential in d. Our algorithm generalizes a previously known algorithm, due to Goldreich and Levin, that solves this task for the case when F = GF(2) (and d = 1). In the setting of agnostic learning, studied by Kearns et al. [21] (see also [27, 28, 22]), the learner is to make no assumptions regarding the natural phenomena underlying the input/output relationship of the function, and the goal of the learner is to come up with a simple explanation which best fits the examples. Therefore the best explanation may account for only part of the phenomena. In some situations, when the phenomena appears very irregular, providing an explanation which fits only part of it is better than nothing. Interestingly, Kearns et al. did not consider the use of queries (but rather examples drawn from an arbitrary distribution) as they were skeptical that queries could be of any help. We show that queries do seem to help.
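The d = 1, F = GF(2) reconstruction task can be illustrated by brute force over a tiny domain (exhaustive search, not the paper's query-efficient algorithm; the planted secret and the corrupted inputs below are made-up illustration values):

```python
# Toy list-decoding of noisy linear functions over GF(2)^n: enumerate all
# 2^n candidate linear functions and keep those agreeing with a corrupted
# oracle on at least a given fraction of inputs. Feasible only for tiny n.
from itertools import product

def agreement(f_table, c, n):
    """Fraction of x in GF(2)^n where <c, x> (mod 2) equals f(x)."""
    hits = 0
    for x in product((0, 1), repeat=n):
        val = sum(ci & xi for ci, xi in zip(c, x)) & 1
        hits += (val == f_table[x])
    return hits / 2 ** n

n = 4
secret = (1, 0, 1, 1)                       # planted linear function
# Oracle: <secret, x> with the answer flipped on 3 of the 16 inputs (noise).
f_table = {}
for idx, x in enumerate(product((0, 1), repeat=n)):
    val = sum(si & xi for si, xi in zip(secret, x)) & 1
    f_table[x] = val ^ (1 if idx in (0, 5, 9) else 0)

# List all linear functions agreeing on at least 3/4 of the inputs.
candidates = [c for c in product((0, 1), repeat=n)
              if agreement(f_table, c, n) >= 0.75]
```

Here the planted secret agrees with the oracle on 13 of 16 inputs and survives the threshold, while every other linear function agrees on at most 11 of 16 (half the domain plus the three flips) and is filtered out; the paper's contribution is achieving this kind of list recovery with queries in time polynomial in n and 1/δ rather than in 2^n.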