Results 1–10 of 123
The multiplicative weights update method: a meta algorithm and applications
, 2005
Abstract

Cited by 147 (13 self)
Algorithms in varied fields use the idea of maintaining a distribution over a certain set and use the multiplicative update rule to iteratively change these weights. Their analyses are usually very similar and rely on an exponential potential function. We present a simple meta algorithm that unifies these disparate algorithms and derives them as simple instantiations of the meta algorithm.
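The idea the abstract describes — keep a weight per expert and rescale each weight multiplicatively according to the cost it incurs — is simple enough to sketch. The following is an illustrative toy version only, not the paper's formal meta algorithm; the names (`cost_rounds`, `eta`) and the assumption that costs lie in [0, 1] are ours.

```python
# Illustrative sketch of the multiplicative weights update rule.
# Assumption (not from the paper): per-round costs lie in [0, 1],
# and eta is a small learning-rate parameter.

def multiplicative_weights(cost_rounds, n_experts, eta=0.1):
    """Maintain one weight per expert; after each round, scale the
    weight of expert i by (1 - eta * cost_i). Returns the final
    normalized distribution over experts."""
    weights = [1.0] * n_experts
    for costs in cost_rounds:
        # multiplicative update: penalize experts in proportion to cost
        weights = [w * (1.0 - eta * c) for w, c in zip(weights, costs)]
    total = sum(weights)
    return [w / total for w in weights]

# Expert 0 always incurs cost 1, expert 1 always cost 0: over 50 rounds
# the distribution shifts almost entirely onto expert 1.
final = multiplicative_weights([[1.0, 0.0]] * 50, n_experts=2)
```

The exponential potential function mentioned in the abstract is the total weight `sum(weights)`, whose decay bounds how much the algorithm can lag behind the best expert.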
Pseudorandom generators without the XOR Lemma (Extended Abstract)
, 1998
Abstract

Cited by 138 (23 self)
Impagliazzo and Wigderson [IW97] have recently shown that if there exists a decision problem solvable in time 2^O(n) and having circuit complexity 2^Ω(n) (for all but finitely many n) then P = BPP. This result is a culmination of a series of works showing connections between the existence of hard predicates and the existence of good pseudorandom generators. The construction of Impagliazzo and Wigderson goes through three phases of "hardness amplification" (a multivariate polynomial encoding, a first derandomized XOR Lemma, and a second derandomized XOR Lemma) that are composed with the Nisan-Wigderson [NW94] generator. In this paper we present two different approaches to proving the main result of Impagliazzo and Wigderson. In developing each approach, we introduce new techniques and prove new results that could be useful in future improvements and/or applications of hardness-randomness tradeoffs. Our first result is that when (a modified version of) the Nisan-Wigderson generator construction is applied with a "mildly" hard predicate, the result is a generator that produces a distribution indistinguishable from having large min-entropy. An extractor can then be used to produce a distribution computationally indistinguishable from uniform. This is the first construction of a pseudorandom generator that works with a mildly hard predicate without doing hardness amplification. We then show that in the Impagliazzo-Wigderson construction only the first hardness-amplification phase (encoding with a multivariate polynomial) is necessary, since it already gives the required average-case hardness. We prove this result by (i) establishing a connection between the hardness-amplification problem and a list-decoding...
Simple Extractors for All Min-Entropies and a New Pseudo-Random Generator
Abstract

Cited by 111 (27 self)
We present a simple, self-contained extractor construction that produces good extractors for all min-entropies (min-entropy measures the amount of randomness contained in a weak random source). Our construction is algebraic and builds on a new polynomial-based approach introduced by Ta-Shma, Zuckerman, and Safra [37]. Using our improvements, we obtain, for example, an extractor with output length m = k^(1−δ) and seed length O(log n). This matches the parameters of Trevisan's breakthrough result [38] and additionally achieves those parameters for small min-entropies k. Extending [38] to small k has been the focus of a sequence of recent works [15, 26, 35]. Our construction gives a much simpler and more direct solution to this problem. Applying similar ideas to the problem of building pseudorandom generators, we obtain a new pseudorandom generator construction that is not based on the NW generator [21], and turns worst-case hardness directly into pseudorandomness. The parameters of this generator match those in [16, 33] and in particular are strong enough to obtain a new proof that P = BPP if E requires exponential size circuits. Essentially the same construction yields a hitting set generator with optimal seed length that outputs s^Ω(1) bits when given a function that requires circuits of size s (for any s). This implies a hardness versus randomness tradeoff for RP and BPP that is optimal (up to polynomial factors), solving an open problem raised by [14]. Our generators can also be used to derandomize AM in a way that improves and extends the results of [4, 18, 20].
Graph Nonisomorphism Has Subexponential Size Proofs Unless The Polynomial-Time Hierarchy Collapses
 SIAM Journal on Computing
, 1998
Abstract

Cited by 110 (4 self)
We establish hardness versus randomness tradeoffs for a broad class of randomized procedures. In particular, we create efficient nondeterministic simulations of bounded round Arthur-Merlin games using a language in exponential time that cannot be decided by polynomial size oracle circuits with access to satisfiability. We show that every language with a bounded round Arthur-Merlin game has subexponential size membership proofs for infinitely many input lengths unless exponential time coincides with the third level of the polynomial-time hierarchy (and hence the polynomial-time hierarchy collapses). This provides the first strong evidence that graph nonisomorphism has subexponential size proofs. We set up a general framework for derandomization which encompasses more than the traditional model of randomized computation. For a randomized procedure to fit within this framework, we only require that for any fixed input the complexity of checking whether the procedure succeeds on a given ...
Extractors and Pseudorandom Generators
 Journal of the ACM
, 1999
Abstract

Cited by 104 (6 self)
We introduce a new approach to constructing extractors. Extractors are algorithms that transform a "weakly random" distribution into an almost uniform distribution. Explicit constructions of extractors have a variety of important applications, and tend to be very difficult to obtain.
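The notion of an extractor as described here — an algorithm turning a "weakly random" distribution into an almost uniform one — can be illustrated by the classical von Neumann trick for a biased coin. This toy is far weaker than (and unrelated to) the constructions in the paper: it handles only independent biased coin flips, not general weak sources, and uses no seed.

```python
# Toy illustration of randomness extraction (the classical von Neumann
# trick), NOT the paper's construction: it works only for i.i.d. biased
# bits, not for general weak random sources.

def von_neumann_extract(bits):
    """Read bits in pairs; a (0,1) pair emits 0, a (1,0) pair emits 1,
    and equal pairs are discarded. For independent flips with a fixed
    bias, both unequal pairs are equally likely, so outputs are uniform."""
    out = []
    for i in range(0, len(bits) - 1, 2):
        a, b = bits[i], bits[i + 1]
        if a != b:
            out.append(a)  # (1,0) -> 1, (0,1) -> 0
    return out

# A heavily biased source still yields unbiased output bits:
extracted = von_neumann_extract([1, 1, 1, 0, 0, 1, 1, 1, 1, 0])  # -> [1, 0, 1]
```

General-purpose extractors must work without any independence assumption on the source, which is why they additionally take a short truly random seed.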
Proofs of retrievability via hardness amplification
 In TCC
, 2009
Abstract

Cited by 84 (4 self)
Proofs of Retrievability (PoR), introduced by Juels and Kaliski [JK07], allow the client to store a file F on an untrusted server, and later run an efficient audit protocol in which the server proves that it (still) possesses the client’s data. Constructions of PoR schemes attempt to minimize the client and server storage, the communication complexity of an audit, and even the number of file blocks accessed by the server during the audit. In this work, we identify several different variants of the problem (such as bounded-use vs. unbounded-use, knowledge-soundness vs. information-soundness), and give nearly optimal PoR schemes for each of these variants. Our constructions either improve (and generalize) the prior PoR constructions, or give the first known PoR schemes with the required properties. In particular, we
• Formally prove the security of an (optimized) variant of the bounded-use scheme of Juels and Kaliski [JK07], without making any simplifying assumptions on the behavior of the adversary.
• Build the first unbounded-use PoR scheme where the communication complexity is linear in the security parameter and which does not rely on Random Oracles, resolving an open question of Shacham and Waters [SW08].
• Build the first bounded-use scheme with information-theoretic security.
The main insight of our work comes from a simple connection between PoR schemes and the notion of hardness amplification, extensively studied in complexity theory. In particular, our improvements come from first abstracting a purely information-theoretic notion of PoR codes, and then building nearly optimal PoR codes using state-of-the-art tools from coding and complexity theory.
Magic Functions
, 1999
Abstract

Cited by 76 (1 self)
We consider three apparently unrelated fundamental problems in distributed computing, cryptography and complexity theory and prove that they are essentially the same problem.
Separating succinct noninteractive arguments from all falsifiable assumptions
 In Proceedings of the 43rd Annual ACM Symposium on Theory of Computing, STOC ’11
, 2011
Abstract

Cited by 75 (4 self)
An argument system (computationally sound proof) for NP is succinct if its communication complexity is polylogarithmic in the instance and witness sizes. The seminal works of Kilian ’92 and Micali ’94 show that such arguments can be constructed under standard cryptographic hardness assumptions with four rounds of interaction, and that they can be made noninteractive in the random oracle model. The latter construction also gives us some evidence that succinct noninteractive arguments (SNARGs) may exist in the standard model with a common reference string (CRS), by replacing the oracle with a sufficiently complicated hash function whose description goes in the CRS. However, we currently do not know of any construction of SNARGs with a proof of security under any simple cryptographic assumption. In this work, we give a broad black-box separation result, showing that black-box reductions cannot be used to prove the security of any SNARG construction based on any falsifiable cryptographic assumption. This includes essentially all common assumptions used in cryptography (one-way functions, trapdoor permutations, DDH, RSA, LWE etc.). More generally, we say that an assumption is falsifiable if it can be modeled as an interactive game between an adversary and an efficient challenger that can efficiently decide if the adversary won the game. This is similar, in spirit, to the notion of falsifiability of Naor ’03, and captures the fact that we can efficiently check if an adversarial strategy breaks the assumption. Our separation result also extends to designated verifier SNARGs, where the verifier needs a trapdoor associated with the CRS to verify arguments, and slightly succinct SNARGs, whose size is only required to be sublinear in the statement and witness size.
Randomness vs. Time: Derandomization under a uniform assumption
Abstract

Cited by 72 (11 self)
We prove that if BPP ≠ EXP, then every problem in BPP can be solved deterministically in subexponential time on almost every input (on every samplable ensemble for infinitely many input sizes). This is the first derandomization result for BPP based on uniform, noncryptographic hardness assumptions. It implies the following gap in the average-instance complexities of problems in BPP: either these complexities are always subexponential or they contain arbitrarily large exponential functions. We use a construction of a small “pseudorandom” set of strings from a “hard function” in EXP which is identical to that used in the analogous nonuniform results of [21, 3]. However, previous proofs of correctness assume the “hard function” is not in P/poly. They give a nonconstructive argument that a circuit distinguishing the pseudorandom strings from truly random strings implies that a similarly sized circuit exists computing the “hard function”. Our main technical contribution is to show that, if the “hard function” has certain properties, then this argument can be made constructive. We then show that, assuming EXP ⊆ P/poly, there are EXP-complete functions with these properties.