Results 1–10 of 128
Noise-tolerant learning, the parity problem, and the statistical query model
 J. ACM
Cited by 164 (2 self)
We describe a slightly subexponential time algorithm for learning parity functions in the presence of random classification noise. This yields a polynomial-time algorithm for the case of parity functions that depend on only the first O(log n · log log n) bits of input. This is the first known instance of an efficient noise-tolerant algorithm for a concept class that is provably not learnable in the Statistical Query model of Kearns [7]. Thus, we demonstrate that the set of problems learnable in the statistical query model is a strict subset of those problems learnable in the presence of noise in the PAC model. In coding-theory terms, what we give is a poly(n)-time algorithm for decoding linear k × n codes in the presence of random noise for the case of k = c log n log log n for some c > 0. (The case of k = O(log n) is trivial, since one can individually check each of the 2^k possible messages and choose the one that yields the closest codeword.) A natural extension of the statistical query model is to allow queries about statistical properties that involve t-tuples of examples (as opposed to single examples). The second result of this paper is to show that any class of functions learnable (strongly or weakly) with t-wise queries for t = O(log n) is also weakly learnable with standard unary queries. Hence this natural extension to the statistical query model does not increase the set of weakly learnable functions.
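The parenthetical remark about k = O(log n) being trivial can be made concrete: enumerate all 2^k messages and keep the one whose codeword is nearest in Hamming distance. The following Python sketch (illustrative, not from the paper) does exactly that for a small binary linear code:

```python
from itertools import product

def brute_force_decode(G, received):
    """Nearest-codeword decoding of a binary linear [n, k] code by
    enumerating all 2**k messages (feasible only when k = O(log n)).

    G        : k x n generator matrix, rows given as lists of 0/1
    received : length-n list of 0/1 (a codeword plus noise)
    Returns (message, distance) for the closest codeword found.
    """
    k, n = len(G), len(G[0])
    best_msg, best_dist = None, n + 1
    for msg in product([0, 1], repeat=k):
        # Encode: codeword = msg * G over GF(2).
        cw = [sum(m * g for m, g in zip(msg, col)) % 2
              for col in zip(*G)]
        dist = sum(c != r for c, r in zip(cw, received))
        if dist < best_dist:
            best_msg, best_dist = list(msg), dist
    return best_msg, best_dist
```

For a [5, 2] code with minimum distance 3, a single flipped bit is corrected; the exponential cost in k is why this only works for logarithmically short messages.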
ID-Based Blind Signature and Ring Signature from Pairings
Proc. of Asiacrypt 2002, LNCS 2501
, 2002
Cited by 98 (13 self)
Recently, bilinear pairings such as the Weil pairing and the Tate pairing on elliptic and hyperelliptic curves have found various applications in cryptography. Several identity-based (simply ID-based) cryptosystems using bilinear pairings of elliptic or hyperelliptic curves have been presented. Blind signatures and ring signatures are very useful for providing the user's anonymity and the signer's privacy, and they play an important role in building e-commerce. In this paper, we propose an ID-based blind signature scheme and an ID-based ring signature scheme, both of which are based on bilinear pairings. We also analyze their security and efficiency.
Lattice-based Cryptography
, 2008
Cited by 67 (5 self)
In this chapter we describe some of the recent progress in lattice-based cryptography. Lattice-based cryptographic constructions hold great promise for post-quantum cryptography, as they enjoy very strong security proofs based on worst-case hardness, relatively efficient implementations, and great simplicity. In addition, lattice-based cryptography is believed to be secure against quantum computers. Our focus here …
Security Bounds for the Design of Code-Based Cryptosystems
, 2009
Cited by 55 (5 self)
Code-based cryptography is often viewed as an interesting “post-quantum” alternative to classical number-theoretic cryptography. Unlike many other such alternatives, it has the convenient advantage of having only a few, well-identified attack algorithms. However, improvements to these algorithms have made their effective complexity quite difficult to compute. We give here some lower bounds on the work factor of idealized versions of these algorithms, taking into account all possible tweaks that could improve their practical complexity. The aim of this article is to help designers select durably secure parameters.
SWIFFT: A Modest Proposal for FFT Hashing
Cited by 51 (17 self)
We propose SWIFFT, a collection of compression functions that are highly parallelizable and admit very efficient implementations on modern microprocessors. The main technique underlying our functions is a novel use of the Fast Fourier Transform (FFT) to achieve “diffusion,” together with a linear combination to achieve compression and “confusion.” We provide a detailed security analysis of concrete instantiations, and give a high-performance software implementation that exploits the inherent parallelism of the FFT algorithm. The throughput of our implementation is competitive with that of SHA-256, with additional parallelism yet to be exploited. Our functions are set apart from prior proposals (having comparable efficiency) by a supporting asymptotic security proof: it can be formally proved that finding a collision in a randomly chosen function from the family (with noticeable probability) is at least as hard as finding short vectors in cyclic/ideal lattices in the worst case.
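The two ingredients named in the abstract, an FFT for diffusion followed by a linear combination for compression, can be illustrated with a toy Python sketch. This is not SWIFFT itself; the transform length (4), modulus (p = 17), and root of unity (4) below are toy values chosen only so the modular arithmetic works out, and the transform is a naive O(n^2) DFT rather than a fast butterfly:

```python
def ntt(vec, root, p):
    """Naive number-theoretic transform (a DFT over Z_p), O(n^2);
    a real implementation would use an FFT-style butterfly."""
    n = len(vec)
    return [sum(v * pow(root, i * j, p) for j, v in enumerate(vec)) % p
            for i in range(n)]

def toy_compress(blocks, coeffs, root=4, p=17):
    """Toy FFT-based compression in the SWIFFT spirit (illustration
    only): transform each input block for "diffusion", then output a
    fixed linear combination of the transforms for "confusion" and
    compression.  root=4 is a 4th root of unity mod 17, so blocks
    here have length 4; several blocks compress to one."""
    out = [0] * len(blocks[0])
    for block, c in zip(blocks, coeffs):
        t = ntt(block, root, p)
        out = [(o + c * ti) % p for o, ti in zip(out, t)]
    return out
```

Because the transform and the combination are both linear, each output coordinate is an independent modular linear form in the inputs, which is the structural feature SWIFFT's security proof exploits.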
Efficient ID-Based Blind Signature and Proxy Signature
 In Proceedings of ACISP 2003, LNCS 2727
, 2003
Cited by 36 (1 self)
Blind signatures and proxy signatures are very important technologies in secure e-commerce. The identity-based (simply ID-based) public key cryptosystem can be a good alternative to the certificate-based public key setting, especially when efficient key management and moderate security are required. In this paper, we propose a new ID-based blind signature scheme and an ID-based partial delegation proxy signature scheme with warrant, based on bilinear pairings. We also analyze their security and efficiency. We claim that our new blind signature scheme is more efficient than Zhang and Kim's scheme [27] from Asiacrypt 2002.
Improved fast syndrome-based cryptographic hash functions
 in Proceedings of ECRYPT Hash Workshop 2007 (2007). URL: http://wwwroc.inria.fr/secret/Matthieu.Finiasz
Cited by 34 (6 self)
Recently, collisions have been exposed for a variety of cryptographic hash functions [19], including some of the most widely used today. Many other hash functions using similar constructions can, however, still be considered secure. Nevertheless, this has drawn attention to the need for new hash function designs. This article presents a family of secure hash functions whose security is directly related to the syndrome decoding problem from the theory of error-correcting codes. Taking into account the analysis by Coron and Joux [4] based on Wagner's generalized birthday algorithm [18], we study the asymptotic security of our functions. We demonstrate that this attack is always exponential in the length of the hash value. We also study the work factor of this attack, along with other attacks from coding theory, in the non-asymptotic range, i.e. for practical values. Accordingly, we propose a few sets of parameters giving good security and either faster hashing or a shorter description for the function. Key Words: cryptographic hash functions, provable security, syndrome decoding, NP-completeness, Wagner's generalized birthday problem.
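The core compression idea, tying security to syndrome decoding, admits a compact sketch. The following is a generic illustration (not the exact parameters or input encoding of the paper's functions): the input selects a few columns of a fixed binary matrix H, and the output is their XOR, i.e. a syndrome of a low-weight word.

```python
def syndrome_compress(H, positions):
    """Toy compression step of a syndrome-based hash (illustration
    only): the input selects w columns of a fixed random r x n binary
    matrix H (rows given as lists of 0/1), and the output is their
    XOR, i.e. the syndrome H*x mod 2 of the weight-w word x that has
    ones exactly at `positions`."""
    r = len(H)
    out = [0] * r
    for j in positions:
        out = [o ^ row[j] for o, row in zip(out, H)]
    return out
```

A collision here is a pair of distinct low-weight words with equal syndrome; their XOR is a low-weight word in the kernel of H, which is exactly an instance of the syndrome decoding problem the abstract invokes.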
Four Practical Attacks for "Optimistic Mixing for Exit-Polls"
, 2003
Cited by 28 (3 self)
Golle, Zhong, Boneh, Jakobsson, and Juels [10] recently presented a very efficient mix-net that they claim to be both robust and secure. We present four practical attacks on their mix-net, breaking both its privacy and its robustness. Each attack exploits and illustrates a separate weakness of the protocol.
The parity problem in the presence of noise, decoding random linear codes, and the subset sum problem
 In RANDOM
, 2005
Cited by 27 (2 self)
In [2], Blum et al. demonstrated the first subexponential algorithm for learning the parity function in the presence of noise. They solved the length-n parity problem in time 2^(O(n / log n)), but it required the availability of 2^(O(n / log n)) labeled examples. As an open problem, they asked whether there exists a 2^(o(n)) algorithm for the length-n parity problem that uses only poly(n) labeled examples. In this work, we provide a positive answer to this question. We show that there is an algorithm that solves the length-n parity problem in time 2^(O(n / log log n)) using n^(1+ε) labeled examples. This result immediately gives us a subexponential algorithm for decoding n × n^(1+ε) random binary linear codes (i.e. codes where the messages are n bits and the codewords are n^(1+ε) bits) in the presence of random noise. We are also able to extend the same techniques to provide a subexponential algorithm for dense instances of the random subset sum problem.
Faster Correlation Attack on Bluetooth Keystream Generator E0
Advances in Cryptology – CRYPTO 2004, Lecture Notes in Computer Science
, 2004
Cited by 24 (2 self)
We study both distinguishing and key-recovery attacks against E0, the keystream generator used in Bluetooth, by means of correlation. First, a powerful method for computing correlations is formulated as a recursive expression, which makes it easier to calculate correlations of the finite state machine output sequences up to 26 bits for E0, and allows us to verify for the first time that the two known correlations are the largest. Second, we apply the concept of convolution to the analysis of the distinguisher based on all correlations, and propose an efficient distinguisher due to the linear dependency of the largest correlations. Last, we propose a novel maximum likelihood decoding algorithm based on the fast Walsh transform to recover the closest codeword for any linear code of dimension L and length n. It requires time O(n + L · 2^L) and memory min(n, 2^L). This can speed up many attacks, such as fast correlation attacks. We apply it to E0, and our best key-recovery attack works in 2^39 time given 2^39 consecutive bits after O(2^37) precomputation. This is the best known attack against E0 so far.
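The maximum-likelihood decoding step can be sketched generically: bucket the ±1-mapped received bits by generator column, then a single fast Walsh-Hadamard transform yields the correlation of every candidate message at once, giving the stated O(n + L · 2^L) time. The Python below is an illustrative sketch of this generic technique, not the paper's tuned E0 attack:

```python
def fwht_ml_decode(L, cols, received):
    """Find the L-bit message whose codeword is closest to `received`,
    for a linear code whose L x n generator matrix has columns `cols`
    (each column packed as an L-bit integer).  Runs in O(n + L * 2^L).
    """
    N = 1 << L
    # O(n): bucket signed received bits by generator column.
    a = [0] * N
    for c, y in zip(cols, received):
        a[c] += 1 if y == 0 else -1
    # O(L * 2^L): in-place fast Walsh-Hadamard transform; afterwards
    # a[m] = sum_i (-1)^(received[i] + <m, cols[i]>), i.e. the
    # correlation between candidate message m and the received word.
    h = 1
    while h < N:
        for i in range(0, N, 2 * h):
            for j in range(i, i + h):
                x, y = a[j], a[j + h]
                a[j], a[j + h] = x + y, x - y
        h *= 2
    # The most likely message maximizes the correlation.
    return max(range(N), key=lambda m: a[m])
```

Compared with brute force over all 2^L messages at O(n · 2^L) cost, the transform amortizes the n-dependent work into a single O(n) bucketing pass, which is what makes it attractive inside fast correlation attacks.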