Results 1–10 of 313,746
Concurrently Secure Identification Schemes Based on the Worst-Case Hardness of Lattice Problems
, 2008
"... In this paper, we show that two variants of Stern’s identification scheme [IEEE Transactions on Information Theory ’96] are provably secure against concurrent attack under assumptions on the worst-case hardness of lattice problems. These assumptions are weaker than those for the previous lattice ..."
Cited by 20 (1 self)
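Stern's scheme itself is built from codes/lattices and is not reproduced here; as a hedged illustration of the same commit–challenge–response (3-move sigma-protocol) shape, the following hypothetical sketch uses Schnorr's discrete-log identification protocol instead, with illustrative parameters:

```python
# Hypothetical sketch of a 3-move (sigma-protocol) identification scheme.
# Uses Schnorr's discrete-log protocol for illustration -- NOT Stern's scheme,
# whose security rests on code/lattice assumptions instead.
import secrets

p = 2**127 - 1          # a Mersenne prime; the group is Z_p^*
g = 3                   # fixed base (illustrative choice)
q = p - 1               # exponents are reduced modulo p - 1

def keygen():
    sk = secrets.randbelow(q)           # prover's secret
    return sk, pow(g, sk, p)            # (secret key, public key)

def identify(sk, pk):
    """One run of the 3-move protocol; returns the verifier's decision."""
    r = secrets.randbelow(q)
    commitment = pow(g, r, p)           # move 1: prover commits
    c = secrets.randbelow(q)            # move 2: verifier's random challenge
    s = (r + c * sk) % q                # move 3: prover's response
    # verifier checks g^s == commitment * pk^c (mod p)
    return pow(g, s, p) == (commitment * pow(pk, c, p)) % p

sk, pk = keygen()
print(identify(sk, pk))   # an honest prover is always accepted
```

Concurrent security, the property the paper is concerned with, asks that many interleaved runs of such a protocol against a cheating verifier still leak nothing usable about `sk`.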
Concurrently Secure Identification Schemes and Ad Hoc Anonymous Identification Schemes Based on the Worst-Case Hardness of Lattice Problems
"... Abstract. In this paper, we show that two variants of Stern’s identification scheme [IEEE IT ’96] are provably secure against concurrent attacks under assumptions on the worst-case hardness of lattice problems. These assumptions are weaker than those for the existing schemes of Micciancio and Va ..."
Cited by 1 (0 self)
Improving Additive and Multiplicative Homomorphic Encryption Schemes Based on Worst-Case Hardness Assumptions
"... In CRYPTO 2010, Aguilar et al. proposed a somewhat homomorphic encryption scheme, i.e. an encryption scheme allowing a limited number of sums and products to be computed over encrypted data, with a security reduction from LWE over general lattices. General lattices (as opposed to ideal lattices) do not have an inherent multiplicative structure, but, using a tensor product, Aguilar et al. managed to obtain a scheme that can compute products with a polylogarithmic number of operands. In this paper we present an alternative construction that can compute products with polynomially many operands while preserving the security reductions of the initial scheme. Unfortunately, despite this improvement, our construction seems to be incompatible with Gentry’s seminal transformation for obtaining fully homomorphic encryption schemes. Recently, Brakerski et al. used the tensor product approach introduced by Aguilar et al. in a new way that radically improves the performance of the resulting scheme. Based on this approach, and using two nice optimizations, their scheme can evaluate products with exponentially many operands and can be transformed into an efficient fully homomorphic encryption scheme while remaining based on general lattice problems. However, even though these results outperform the construction presented here, we believe the modifications we suggest for Aguilar et al.’s schemes are of independent interest. ..."
Cited by 1 (0 self)
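The additive half of such schemes is easy to see in miniature. The following hedged sketch is a toy symmetric LWE-style scheme with made-up parameters (not the Aguilar et al. construction): ciphertexts are added componentwise, and the encrypted bits add modulo 2 as long as the accumulated noise stays small.

```python
# Toy symmetric LWE encryption of single bits, showing additive homomorphism.
# Parameters are illustrative only; real schemes use far larger dimensions.
import random

n, q = 16, 12289
random.seed(1)
s = [random.randrange(q) for _ in range(n)]   # secret key

def enc(m):
    """Encrypt bit m as (a, b) with b = <a,s> + e + m*(q//2) mod q, e small."""
    a = [random.randrange(q) for _ in range(n)]
    e = random.randrange(-4, 5)               # small noise
    b = (sum(ai * si for ai, si in zip(a, s)) + e + m * (q // 2)) % q
    return a, b

def dec(ct):
    """Recover the bit by rounding b - <a,s> to 0 or q//2."""
    a, b = ct
    v = (b - sum(ai * si for ai, si in zip(a, s))) % q
    return 1 if q // 4 < v < 3 * q // 4 else 0

def add(ct1, ct2):
    """Homomorphic addition: componentwise sum encrypts m1 XOR m2."""
    a1, b1 = ct1
    a2, b2 = ct2
    return [(x + y) % q for x, y in zip(a1, a2)], (b1 + b2) % q
```

Multiplication is the hard part, which is exactly where the tensor-product techniques discussed in the abstract come in: general lattices offer no native product on ciphertexts.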
Worst-case equilibria
 IN PROCEEDINGS OF THE 16TH ANNUAL SYMPOSIUM ON THEORETICAL ASPECTS OF COMPUTER SCIENCE
, 1999
"... In a system in which noncooperative agents share a common resource, we propose the ratio between the worst possible Nash equilibrium and the social optimum as a measure of the effectiveness of the system. Deriving upper and lower bounds for this ratio in a model in which several agents share a ver ..."
Cited by 851 (17 self)
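The ratio proposed in this abstract (later known as the price of anarchy) can be computed by brute force in tiny instances. A minimal sketch, assuming the simplest Koutsoupias–Papadimitriou setting of identical parallel links, pure strategies, and makespan as social cost; the task weights below are an illustrative example, not taken from the paper:

```python
# Brute-force worst-Nash-to-optimum ratio on m identical parallel links.
from itertools import product

def nash_and_opt(weights, m=2):
    """Enumerate all pure assignments of tasks to links; return
    (worst pure-Nash makespan, optimal makespan)."""
    best, worst_nash = None, None
    for assign in product(range(m), repeat=len(weights)):
        loads = [0] * m
        for w, link in zip(weights, assign):
            loads[link] += w
        mk = max(loads)
        best = mk if best is None else min(best, mk)
        # pure Nash: no task can strictly lower its own cost by moving
        is_nash = all(
            loads[link] <= loads[k] + w
            for w, link in zip(weights, assign)
            for k in range(m) if k != link
        )
        if is_nash:
            worst_nash = mk if worst_nash is None else max(worst_nash, mk)
    return worst_nash, best

wn, opt = nash_and_opt([3, 3, 2, 2, 2])
print(wn, opt)   # worst pure Nash has makespan 7, the optimum is 6
```

Here the assignment {3, 2, 2 | 3, 2} is a pure Nash equilibrium with makespan 7 (no task gains by moving), while the optimum balances the links at 6, giving a ratio of 7/6 for this instance.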
Generalized compact knapsacks are collision resistant
 In ICALP (2
, 2006
"... A step in the direction of creating efficient cryptographic functions based on worst-case hardness was ..."
Cited by 57 (15 self)
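At its core, a compact knapsack function is an Ajtai-style subset-sum hash h_A(x) = A·x mod q, whose collision resistance (for suitable parameters) follows from worst-case lattice assumptions. A minimal sketch with illustrative, cryptographically meaningless parameters chosen only so the function compresses:

```python
# Toy Ajtai-style knapsack hash: {0,1}^m -> Z_q^n with m > n*log2(q),
# so the function compresses and collisions must exist.
# Parameters are illustrative only; they carry no real security.
import random

n, m, q = 8, 128, 257
random.seed(0)
A = [[random.randrange(q) for _ in range(m)] for _ in range(n)]  # public matrix

def knapsack_hash(x):
    """h_A(x) = A.x mod q for a binary input vector x of length m."""
    assert len(x) == m and all(b in (0, 1) for b in x)
    return tuple(sum(A[i][j] * x[j] for j in range(m)) % q for i in range(n))
```

The domain has 2^128 inputs while the range has only 257^8 ≈ 2^64 outputs, so collisions abound; the point of the paper's line of work is that *finding* one is provably as hard as worst-case lattice problems.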
Public-key cryptosystems from the worst-case shortest vector problem
, 2008
"... We construct public-key cryptosystems that are secure assuming the worst-case hardness of approximating the length of a shortest nonzero vector in an n-dimensional lattice to within a small poly(n) factor. Prior cryptosystems with worst-case connections were based either on the shortest vector probl ..."
Cited by 153 (22 self)
Tight bounds for worst-case equilibria
 Proc. 13th SODA
, 2002
"... We study the problem of traffic routing in noncooperative networks. In such networks, users may follow selfish strategies to optimize their own performance measure, and therefore their behavior does not have to lead to optimal performance of the entire network. In this paper we investigate the worst-case coordination ratio, a game-theoretic measure aiming to reflect the price of selfish routing. Following a line of previous work, we focus on the most basic networks, consisting of parallel links with linear latency functions. Our main result is that the worst-case coordination ratio ..."
Cited by 186 (6 self)
Worst-Case to Average-Case Reductions Revisited
"... A fundamental goal of computational complexity (and the foundations of cryptography) is to find a polynomial-time samplable distribution (e.g., the uniform distribution) and a language in NTIME(f(n)) for some polynomial function f, such that the language is hard on the average with respect to this distribution, given that NP is worst-case hard (i.e., NP ≠ P, or NP ⊈ BPP). Currently, no such result is known even if we relax the language to be in nondeterministic subexponential time. There has been a long line of research trying to explain our failure to prove such worst-case/average-case ..."
Cited by 5 (0 self)
If NP languages are hard on the worst-case, then it is easy to find their hard instances
 PROCEEDINGS OF THE 20TH ANNUAL CONFERENCE ON COMPUTATIONAL COMPLEXITY, (CCC)
, 2005
"... We prove that if NP ⊈ BPP, i.e., if some NP-complete language is worst-case hard, then for every probabilistic algorithm trying to decide the language, there exists some polynomially samplable distribution that is hard for it. That is, the algorithm often errs on inputs from this distribution. This ..."
Cited by 19 (7 self)
Where the REALLY Hard Problems Are
 IN J. MYLOPOULOS AND R. REITER (EDS.), PROCEEDINGS OF 12TH INTERNATIONAL JOINT CONFERENCE ON AI (IJCAI91),VOLUME 1
, 1991
"... It is well known that for many NP-complete problems, such as k-SAT, etc., typical cases are easy to solve, so that computationally hard cases must be rare (assuming P != NP). This paper shows that NP-complete problems can be summarized by at least one "order parameter", and that the hard p ..."
Cited by 681 (1 self)
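For random k-SAT, the "order parameter" in this line of work is the clause-to-variable ratio, and the hard instances cluster near the satisfiability threshold (empirically around 4.27 for 3-SAT). A small hypothetical experiment, with brute-force checking and toy sizes chosen for speed, shows the sharp drop in the fraction of satisfiable formulas:

```python
# Random 3-SAT phase-transition experiment (toy sizes, brute-force SAT check).
import random
from itertools import product

def random_3sat(n_vars, n_clauses, rng):
    """A random 3-CNF: each clause has 3 distinct variables with random signs."""
    clauses = []
    for _ in range(n_clauses):
        vs = rng.sample(range(1, n_vars + 1), 3)
        clauses.append(tuple(v if rng.random() < 0.5 else -v for v in vs))
    return clauses

def satisfiable(n_vars, clauses):
    """Brute-force satisfiability check -- only viable for tiny n_vars."""
    for assign in product((False, True), repeat=n_vars):
        if all(any(assign[abs(lit) - 1] == (lit > 0) for lit in cl)
               for cl in clauses):
            return True
    return False

def sat_fraction(n_vars=10, trials=20, ratios=(2.0, 4.3, 7.0), seed=42):
    """Fraction of satisfiable random formulas at each clause/variable ratio."""
    rng = random.Random(seed)
    return {r: sum(satisfiable(n_vars, random_3sat(n_vars, int(r * n_vars), rng))
                   for _ in range(trials)) / trials
            for r in ratios}

print(sat_fraction())
```

Well below the threshold almost every formula is satisfiable, well above it almost none is; the computationally hard instances for complete solvers concentrate in the narrow band in between, which is the paper's central observation.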