Results 1–10 of 205,317
The Small-World Phenomenon: An Algorithmic Perspective
in Proceedings of the 32nd ACM Symposium on Theory of Computing, 2000
"... Long a matter of folklore, the “small-world phenomenon” — the principle that we are all linked by short chains of acquaintances — was inaugurated as an area of experimental study in the social sciences through the pioneering work of Stanley Milgram in the 1960s. This work was among the first to m ..."
Cited by 824 (5 self)
that no decentralized algorithm, operating with local information only, can construct short paths in these networks with non-negligible probability. We then define an infinite family of network models that naturally generalizes the Watts-Strogatz model, and show that for one of these models, there is a decentralized ...
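The decentralized routing setting this entry describes can be sketched with a toy model: greedy routing on a ring where each node knows only its two ring neighbors and one uniformly random shortcut. This is an illustration of "local information only" routing, not the paper's construction (the paper analyzes distance-dependent shortcut distributions).

```python
import random

def greedy_route(n, seed=0):
    """Decentralized greedy routing on an n-node ring where each node
    knows its two ring neighbors plus one uniformly random shortcut.
    Toy sketch only: the uniform shortcut distribution here is not the
    distance-dependent one analyzed in the paper."""
    rng = random.Random(seed)
    shortcut = [rng.randrange(n) for _ in range(n)]
    src, dst = 0, n // 2

    def ring_dist(a, b):
        d = abs(a - b)
        return min(d, n - d)

    cur, hops = src, 0
    while cur != dst:
        # local information only: the current node's neighbors and shortcut
        candidates = [(cur - 1) % n, (cur + 1) % n, shortcut[cur]]
        cur = min(candidates, key=lambda v: ring_dist(v, dst))
        hops += 1
    return hops
```

Since a ring neighbor that moves one step toward the target is always available, the distance to the target strictly decreases each hop, so the walk terminates in at most n/2 hops; shortcuts are taken only when they help.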
Boosting combinatorial search through randomization, 1998
"... Unpredictability in the running time of complete search procedures can often be explained by the phenomenon of “heavy-tailed cost distributions”, meaning that at any time during the experiment there is a non-negligible probability of hitting a problem that requires exponentially more time to solve t ..."
Cited by 361 (35 self)
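The heavy-tailed run-time behavior this entry describes motivates randomized restarts: cut off each run early and retry, so the rare extremely long runs are avoided. A minimal sketch, with a hypothetical solver whose run time is heavy-tailed:

```python
import random

def toy_solve(rng, cutoff):
    """Hypothetical randomized search whose run time is heavy-tailed:
    steps ~ int(1/U) for U uniform on (0, 1) has no finite mean."""
    u = rng.random()
    steps = int(1.0 / u) if u > 0 else cutoff + 1
    return steps if steps <= cutoff else None  # None = run hit the cutoff

def run_with_restarts(solve, cutoff, max_restarts=100, seed=0):
    """Restart the search after `cutoff` steps. Under a heavy-tailed
    run-time distribution, many short runs sidestep the rare
    exponentially long ones; returns total steps spent, or None."""
    rng = random.Random(seed)
    total = 0
    for _ in range(max_restarts):
        steps = solve(rng, cutoff)
        if steps is not None:
            return total + steps
        total += cutoff  # censored run; restart with fresh randomness
    return None  # unsolved within the restart budget
```

The cutoff and restart budget here are arbitrary illustrative values; in practice they are tuned to the observed run-time distribution.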
Loopy belief propagation for approximate inference: An empirical study
in Proceedings of Uncertainty in AI, 1999
"... Recently, researchers have demonstrated that “loopy belief propagation” (the use of Pearl's polytree algorithm in a Bayesian network with loops) can perform well in the context of error-correcting codes. The most dramatic instance of this is the near Shannon-limit performanc ..."
Cited by 676 (15 self)
marginals at the last two iterations. We only plot the diseases which had non-negligible posterior probability. [figure: loopy belief propagation over a range of priors] To test this hypothesis, we reparameterized the pyramid network as follows: we set the prior probability of the "1" ...
Weaknesses in the Key Scheduling Algorithm of RC4
in Proceedings of the 4th Annual Workshop on Selected Areas in Cryptography, 2001
"... In this paper we present several weaknesses in the key scheduling algorithm of RC4, and describe their cryptanalytic significance. We identify a large number of weak keys, in which knowledge of a small number of key bits suffices to determine many state and output bits with non-negligible probabilit ..."
Cited by 270 (1 self)
The Determinants of Credit Spread Changes
in Journal of Finance, 2001
"... ABSTRACT Using dealer's quotes and transactions prices on straight industrial bonds, we investigate the determinants of credit spread changes. Variables that should in theory determine credit spread changes have rather limited explanatory power. Further, the residuals from this regression are ..."
Cited by 422 (2 self)
regression beyond the region where actual option prices are most typically observed. Note that if there is a non-negligible probability of large negative jumps in firm value, then the appropriate hedging tool for corporate debt may not be the firm's equity, but rather deep out-of-the-money puts ...
Higher-Order Cryptanalysis of Block Ciphers, 1999
"... The theme in this thesis is design and analysis of block ciphers. Specifically, new attacks are described that successfully break cryptosystems in which the ciphertext is expressible as evaluations of some low-degree polynomial in the plaintext with a low but non-negligible probability. The attacks ..."
Cited by 6 (0 self)
Elections Can be Manipulated Often
"... The Gibbard-Satterthwaite theorem states that every non-trivial voting method between at least 3 alternatives can be strategically manipulated. We prove a quantitative version of the Gibbard-Satterthwaite theorem: a random manipulation by a single random voter will succeed with non-negligible probab ..."
Cited by 66 (1 self)
Tracing Traitors, 1994
"... We give cryptographic schemes that help trace the source of leaks when sensitive or proprietary data is made available to a large set of parties. A very relevant application is in the context of pay television, where only paying customers should be able to view certain programs. In this application ..."
Cited by 172 (10 self)
probability. Since there is typically little demand for decoders which decrypt only a small fraction of the transmissions (even if it is non-negligible), we further introduce threshold tracing schemes which can only be used against decoders which succeed in decryption with probability greater than some ...
On the Power of Two-Point Based Sampling
in Journal of Complexity, 1989
"... The purpose of this note is to present a new sampling technique and to demonstrate some of its properties. The new technique consists of picking two elements at random, and deterministically generating (from them) a long sequence of pairwise independent elements. The sequence is guaranteed to inters ..."
Cited by 98 (17 self)
to intersect, with high probability, any set of non-negligible density. 1. Introduction In recent years the role of randomness in computation has become more and more dominant. Randomness was used to speed up sequential computations (e.g. primality testing, testing polynomial identities, etc.), but its effect ...
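The two-point technique this entry describes can be sketched over a small prime field: two truly random elements are expanded into a long pairwise-independent sequence. The prime and the length below are illustrative choices, not parameters from the paper.

```python
import random

P = 101  # a small prime; the sample space is the field Z_P (illustrative)

def two_point_samples(rng):
    """Pick two random field elements a, b and deterministically expand
    them into the sequence x_i = a + i*b (mod P). For i != j the map
    (a, b) -> (x_i, x_j) is a bijection on Z_P^2 (since i - j is
    invertible mod P), so the pair is uniform: the samples are pairwise
    independent. By Chebyshev, the sequence then intersects any set of
    non-negligible density with high probability."""
    a, b = rng.randrange(P), rng.randrange(P)
    return [(a + i * b) % P for i in range(P)]
```

Only two random elements are consumed, yet the sequence behaves, for second-moment arguments, like P independent samples.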
Modeling of Internet Traffic Data
"... This paper studies the progress made to date on the modeling of internet traffic. It more specifically looks at the repeatedly reported occurrence of extreme value behavior in aggregate internet traffic. A random variable that follows a heavy-tailed distribution can give rise to extremel ..."
to extremely large values with non-negligible probability. The use of an appropriate extreme value traffic model could be useful in avoiding overcrowding on network lines due to extreme behavior. Index terms: extreme values, internet traffic, long-range dependence, self-similar ...
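The heavy-tailed behavior this entry refers to can be illustrated with an inverse-CDF Pareto draw; the tail index below is an arbitrary illustrative value, not one fitted to traffic data.

```python
import random

def pareto_sample(alpha, rng):
    """Inverse-CDF draw from Pareto(alpha) with minimum value 1, i.e.
    P(X > t) = t**(-alpha) for t >= 1. For small alpha the tail is
    heavy: values far above typical occur with non-negligible
    probability. Illustrative only; not a fitted traffic model."""
    u = rng.random()                    # uniform on [0, 1)
    return (1.0 - u) ** (-1.0 / alpha)  # 1 - u is in (0, 1], so safe
```

With alpha = 1.1, roughly t^(-1.1) of draws exceed t; for example, about 8% of draws exceed 10 times the minimum, so extreme values are routine rather than rare.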