Results 1 - 2 of 2
The Bit Extraction Problem or t-Resilient Functions
, 1985
Abstract

Cited by 171 (11 self)
We consider the following adversarial situation. Let n, m and t be arbitrary integers, and let f : {0,1}^n → {0,1}^m be a function. An adversary, knowing the function f, sets t of the n input bits, while the rest (n − t input bits) are chosen at random (independently and with uniform probability distribution). The adversary tries to prevent the outcome of f from being uniformly distributed in {0,1}^m. The question addressed is for what values of n, m and t the adversary necessarily fails in biasing the outcome of f : {0,1}^n → {0,1}^m, when restricted to setting t of the input bits of f. We present various lower and upper bounds on the m's allowing an affirmative answer. These bounds are relatively close for t ≤ n/3 and for t ≥ 2n/3. Our results have applications in the fields of fault-tolerance and cryptography.

1. INTRODUCTION
The bit extraction problem formulated above was suggested by Brassard and Robert [BRref] and by V...
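The parity (XOR) function is the classic example of such a resilient function: fixing any t ≤ n − 1 of the input bits still leaves the single output bit uniformly distributed, since one free random bit flips the parity with probability 1/2. The brute-force check below is an illustrative sketch, not code from the paper:

```python
import itertools


def parity(bits):
    """XOR of all input bits: a 1-output-bit, (n-1)-resilient function."""
    acc = 0
    for b in bits:
        acc ^= b
    return acc


def output_distribution(n, fixed):
    """Enumerate every setting of the free bits and count the outputs.

    `fixed` maps an input position to the value the adversary pins there;
    the remaining n - len(fixed) positions range over all 0/1 values.
    """
    free = [i for i in range(n) if i not in fixed]
    counts = [0, 0]
    for assignment in itertools.product([0, 1], repeat=len(free)):
        bits = [0] * n
        for i, v in fixed.items():
            bits[i] = v
        for i, v in zip(free, assignment):
            bits[i] = v
        counts[parity(bits)] += 1
    return counts


# An adversary fixes t = 2 of n = 5 bits; the output stays uniform.
print(output_distribution(5, {0: 1, 3: 0}))  # → [4, 4]
```

Whichever t < n bits the adversary fixes, the two output counts come out equal, which is exactly the "adversary necessarily fails" condition for m = 1.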
On The Power Of Two-Point Based Sampling
 Journal of Complexity
, 1989
Abstract

Cited by 98 (17 self)
The purpose of this note is to present a new sampling technique and to demonstrate some of its properties. The new technique consists of picking two elements at random and deterministically generating (from them) a long sequence of pairwise independent elements. The sequence is guaranteed to intersect, with high probability, any set of non-negligible density.

1. Introduction
In recent years the role of randomness in computation has become more and more dominant. Randomness was used to speed up sequential computations (e.g. primality testing, testing polynomial identities, etc.), but its effect on parallel and distributed computation is even more impressive. In either case the solutions are typically presented such that they are guaranteed to produce the desired result with some non-negligible probability. It is implicitly suggested that if a higher degree of confidence is required the algorithm should be run several times, each time using different coin tosses. Since the coin tosses f...
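A standard construction matching this description (assumed here for illustration, not quoted from the note) derives the sequence x_i = a + i·b mod p from two random seeds a, b in Z_p, where p is prime; for distinct indices i, j < p, the pair (x_i, x_j) is pairwise independent, so by Chebyshev's inequality the sequence hits any set of non-negligible density with high probability:

```python
import random

P = 101  # a prime; the sample space is Z_P


def two_point_sequence(a, b, length):
    """From two seeds a, b in Z_P, derive the pairwise-independent
    sequence x_i = a + i*b (mod P), for i = 0, ..., length-1."""
    return [(a + i * b) % P for i in range(length)]


def hits(target, a, b, length):
    """Does the derived sequence intersect the target set?"""
    return any(x in target for x in two_point_sequence(a, b, length))


# Empirically: two random seeds suffice to hit a ~10%-density set
# most of the time, using only 2 log P random bits rather than
# fresh coins for each of the 20 samples.
random.seed(0)
target = set(range(10))  # density ~10% in Z_P
trials = 1000
success = sum(
    hits(target, random.randrange(P), random.randrange(P), 20)
    for _ in range(trials)
)
print(success / trials)
```

The design point is randomness efficiency: 20 fully independent samples would cost 20·log P random bits, whereas the two-point sequence costs 2·log P bits while still supporting a Chebyshev-style hitting bound.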