Results 1 – 10 of 431,220
A New Statistical Parser Based on Bigram Lexical Dependencies
, 1996
"... This paper describes a new statistical parser which is based on probabilities of dependencies between headwords in the parse tree. Standard bigram probability estimation techniques are extended to calculate probabilities of dependencies between pairs of words. Tests using Wall Street Journal ..."
Cited by 491 (4 self)
… strategy parsing speed can be improved to over 200 sentences a minute with negligible loss in accuracy.
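The abstract describes extending bigram probability estimation to dependencies between pairs of words. As a hedged illustration only, the core idea of a relative-frequency estimate over (head, modifier) pairs can be sketched as below; the toy counts are hypothetical, and the paper's actual model conditions on additional context rather than raw word pairs alone:

```python
from collections import Counter

# Toy treebank of (head, modifier) dependency pairs -- hypothetical data,
# not the Wall Street Journal counts used in the paper.
pairs = [("bought", "IBM"), ("bought", "shares"), ("bought", "shares"),
         ("sold", "shares"), ("bought", "yesterday")]

pair_counts = Counter(pairs)
head_counts = Counter(h for h, _ in pairs)

def p_dep(head, modifier):
    """Relative-frequency (MLE) estimate of P(modifier | head)."""
    if head_counts[head] == 0:
        return 0.0
    return pair_counts[(head, modifier)] / head_counts[head]
```

For instance, `p_dep("bought", "shares")` is 2/4 = 0.5 under these toy counts.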
The performance of TCP/IP for networks with high bandwidth-delay products and random loss
, 1997
"... This paper examines the performance of TCP/IP, the Internet data transport protocol, over Wide Area Networks (WANs) in which data traffic could coexist with real-time traffic such as voice and video. Specifically, we attempt to develop a basic understanding, using analysis and simulation, of the pro ..."
Cited by 464 (6 self)
… of the properties of TCP/IP in a regime where (1) the bandwidth-delay product of the network is high compared to the buffering in the network, and (2) there may be transient congestion due to fluctuations in real-time traffic, modeled here as producing random losses among the packets of the TCP connection.
The Nash Bargaining Solution in Economic Modeling
 Rand Journal of Economics
, 1986
"... This article establishes the relationship between the static axiomatic theory of bargaining and the sequential strategic approach to bargaining. We consider two strategic models of alternating offers. The models differ in the source of the incentive of the bargaining parties to reach agreement: the ..."
Cited by 556 (1 self)
… the bargainers' time preference and the risk of breakdown of negotiation. Each of the models has a unique perfect equilibrium. When the motivation to reach agreement is made negligible, in each model the unique perfect equilibrium outcome approaches the Nash bargaining solution, with utilities that reflect …
The Macroscopic Behavior of the TCP Congestion Avoidance Algorithm
, 1997
"... In this paper, we analyze a performance model for the TCP Congestion Avoidance algorithm. The model predicts the bandwidth of a sustained TCP connection subjected to light to moderate packet losses, such as loss caused by network congestion. It assumes that TCP avoids retransmission timeouts and alw ..."
Cited by 648 (18 self)
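This line of analysis leads to the well-known "square root" relation between loss rate and sustained throughput, BW ≈ (MSS/RTT) · C/√p with C ≈ √(3/2) under periodic-loss assumptions. A hedged sketch (constants and units are the commonly cited ones, not copied from the paper's text):

```python
import math

def tcp_bandwidth(mss_bytes, rtt_s, loss_rate, c=math.sqrt(1.5)):
    """Macroscopic estimate of sustained TCP throughput in bytes/second.

    BW ~ (MSS / RTT) * C / sqrt(p), with C = sqrt(3/2) under the
    periodic-loss idealization; valid only for light-to-moderate loss,
    as the abstract itself notes.
    """
    return (mss_bytes / rtt_s) * c / math.sqrt(loss_rate)
```

For example, a 1460-byte MSS, 100 ms RTT, and 1% loss give roughly 1.8e5 bytes/s (about 1.4 Mbit/s).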
For Most Large Underdetermined Systems of Linear Equations the Minimal ℓ1-Norm Solution is also the Sparsest Solution
 Comm. Pure Appl. Math
, 2004
"... We consider linear equations y = Φα where y is a given vector in R^n, Φ is a given n-by-m matrix with n < m ≤ An, and we wish to solve for α ∈ R^m. We suppose that the columns of Φ are normalized to unit ℓ2 norm and we place uniform measure on such Φ. We prove the existence of ρ = ρ(A) so that ..."
Cited by 560 (10 self)
that for large n, and for all Φ's except a negligible fraction, the following property holds: for every y having a representation y = Φα0 by a coefficient vector α0 ∈ R^m with fewer than ρ · n nonzeros, the solution α1 of the ℓ1 minimization problem min ‖α‖1 subject to Φα = y is unique and equal to α0.
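The ℓ1 problem min ‖α‖1 subject to Φα = y becomes a linear program after the standard split α = u − v with u, v ≥ 0. A minimal sketch using SciPy's `linprog` (the sparse-recovery demo data are illustrative, not the paper's regime of large n):

```python
import numpy as np
from scipy.optimize import linprog

def l1_min(Phi, y):
    """Solve min ||a||_1 s.t. Phi @ a = y via the LP reformulation
    a = u - v with u, v >= 0, minimizing sum(u) + sum(v)."""
    n, m = Phi.shape
    c = np.ones(2 * m)
    A_eq = np.hstack([Phi, -Phi])
    res = linprog(c, A_eq=A_eq, b_eq=y, bounds=[(0, None)] * (2 * m))
    return res.x[:m] - res.x[m:]

# Demo: random Phi with unit-norm columns, sparse alpha0.
rng = np.random.default_rng(0)
n, m = 20, 40
Phi = rng.standard_normal((n, m))
Phi /= np.linalg.norm(Phi, axis=0)
alpha0 = np.zeros(m)
alpha0[[3, 17, 29]] = [1.0, -2.0, 0.5]
alpha1 = l1_min(Phi, Phi @ alpha0)
```

By LP optimality the returned solution is feasible and never has larger ℓ1 norm than α0; for sufficiently sparse α0 (the paper's regime) it coincides with α0.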
Lucas-Kanade 20 Years On: A Unifying Framework: Part 3
 International Journal of Computer Vision
, 2002
"... Since the Lucas-Kanade algorithm was proposed in 1981 image alignment has become one of the most widely used techniques in computer vision. Applications range from optical flow, tracking, and layered motion, to mosaic construction, medical image registration, and face coding. Numerous algorithms hav ..."
Cited by 698 (30 self)
… examine which of the extensions to the Lucas-Kanade algorithm can be used with the inverse compositional algorithm without any significant loss of efficiency, and which cannot. In this paper, Part 3 in a series of papers, we cover the extension of image alignment to allow linear appearance variation.
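At its core, Lucas-Kanade alignment is Gauss-Newton on the sum-of-squared-differences error using image gradients. A minimal 1-D translation estimator in that spirit (this is the basic additive update, not the inverse compositional variant or the appearance extension the paper covers):

```python
import numpy as np

def lk_translation_1d(template, image, iters=20):
    """Estimate a scalar shift p with image(x + p) ~= template(x)
    by Gauss-Newton on the SSD error (additive Lucas-Kanade update)."""
    x = np.arange(len(template), dtype=float)
    p = 0.0
    for _ in range(iters):
        warped = np.interp(x + p, x, image)   # I(x + p)
        grad = np.gradient(warped)            # dI/dx at the warped positions
        err = template - warped
        p += grad.dot(err) / grad.dot(grad)   # normal-equation step
    return p

# Demo: a smooth bump shifted by 2 pixels.
x = np.arange(64, dtype=float)
template = np.exp(-(x - 30.0) ** 2 / 50.0)
image = np.exp(-(x - 32.0) ** 2 / 50.0)
shift = lk_translation_1d(template, image)
```

The estimate converges because the initial displacement (2 px) is small relative to the width of the bump; larger displacements require coarse-to-fine schemes.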
Analysis, Modeling and Generation of Self-Similar VBR Video Traffic
, 1994
"... We present a detailed statistical analysis of a 2-hour long empirical sample of VBR video. The sample was obtained by applying a simple intra-frame video compression code to an action movie. The main findings of our analysis are (1) the tail behavior of the marginal bandwidth distribution can be accu ..."
Cited by 546 (6 self)
We present a detailed statistical analysis of a 2-hour long empirical sample of VBR video. The sample was obtained by applying a simple intra-frame video compression code to an action movie. The main findings of our analysis are (1) the tail behavior of the marginal bandwidth distribution can be accurately described using "heavy-tailed" distributions (e.g., Pareto); (2) the autocorrelation of the VBR video sequence decays hyperbolically (equivalent to long-range dependence) and can be modeled using self-similar processes. We combine our findings in a new (non-Markovian) source model for VBR video and present an algorithm for generating synthetic traffic. Trace-driven simulations show that statistical multiplexing results in significant bandwidth efficiency even when long-range dependence is present. Simulations of our source model show long-range dependence and heavy-tailed marginals to be important components which are not accounted for in currently used VBR video traffic models.
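For the Pareto-type marginals mentioned above, the shape parameter controls how heavy the tail is (shape below 2 implies infinite variance). A small stdlib-only sketch of the maximum-likelihood shape estimate on synthetic Pareto data (the generator and parameters are illustrative, not the paper's video trace):

```python
import math
import random

def pareto_shape_mle(samples, xm=1.0):
    """ML estimate of the Pareto shape: alpha_hat = n / sum(ln(x / xm)).

    For genuinely heavy-tailed data the estimate stabilises as n grows;
    light-tailed data would not fit a small alpha."""
    logs = [math.log(x / xm) for x in samples if x > xm]
    return len(logs) / sum(logs)

random.seed(42)
alpha_true = 1.5   # infinite-variance regime typical of "heavy-tailed" claims
data = [random.paretovariate(alpha_true) for _ in range(50000)]
alpha_hat = pareto_shape_mle(data)
```

On 50,000 samples the estimate lands within a few hundredths of the true shape 1.5.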
Modeling Term Structures of Defaultable Bonds
, 1999
"... This article presents convenient reduced-form models of the valuation of contingent claims subject to default risk, focusing on applications to the term structure of interest rates for corporate or sovereign bonds. Examples include the valuation of a credit-spread option ..."
Cited by 651 (34 self)
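A central device in reduced-form default modeling of this kind is discounting promised cash flows at a default-adjusted short rate R = r + hL, where h is the risk-neutral default intensity and L the fractional loss at default. A sketch under the simplifying assumption that all three quantities are constant (the paper treats them as stochastic processes):

```python
import math

def defaultable_zero_price(r, h, L, T):
    """Zero-coupon price under a constant default-adjusted short rate.

    Discounts at R = r + h * L: h = risk-neutral default intensity,
    L = fractional loss of value at default. Constants are an
    illustrative simplification of the reduced-form framework.
    """
    return math.exp(-(r + h * L) * T)
```

With r = 5%, h = 2%, L = 60%, and T = 1 year, the defaultable bond trades at exp(-0.062) ≈ 0.9399, below the default-free exp(-0.05) ≈ 0.9512; the gap is the credit spread hL = 120 bp.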
Fuzzy extractors: How to generate strong keys from biometrics and other noisy data. Technical Report 2003/235, Cryptology ePrint archive, http://eprint.iacr.org, 2006. Previous version appeared at EUROCRYPT 2004
 Yevgeniy Dodis, Leonid Reyzin, and Adam Smith
, 2004
"... We provide formal definitions and efficient secure techniques for • turning noisy information into keys usable for any cryptographic application, and, in particular, • reliably and securely authenticating biometric data. Our techniques apply not just to biometric information, but to any keying mater ..."
Cited by 532 (38 self)
We provide formal definitions and efficient secure techniques for • turning noisy information into keys usable for any cryptographic application, and, in particular, • reliably and securely authenticating biometric data. Our techniques apply not just to biometric information, but to any keying material that, unlike traditional cryptographic keys, is (1) not reproducible precisely and (2) not distributed uniformly. We propose two primitives: a fuzzy extractor reliably extracts nearly uniform randomness R from its input; the extraction is error-tolerant in the sense that R will be the same even if the input changes, as long as it remains reasonably close to the original. Thus, R can be used as a key in a cryptographic application. A secure sketch produces public information about its input w that does not reveal w, and yet allows exact recovery of w given another value that is close to w. Thus, it can be used to reliably reproduce error-prone biometric inputs without incurring the security risk inherent in storing them. We define the primitives to be both formally secure and versatile, generalizing much prior work. In addition, we provide nearly optimal constructions of both primitives for various measures of “closeness” of input data, such as Hamming distance, edit distance, and set difference.
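One standard secure-sketch construction for Hamming distance in this line of work is the code-offset sketch: publish s = w ⊕ C(k) for a random codeword C(k), and recover w from a close w′ by decoding w′ ⊕ s. A toy sketch using a 3× repetition code (the repetition code and SHA-256 key derivation are illustrative simplifications; real constructions use stronger codes and randomness extractors):

```python
import hashlib
import secrets

R = 3  # repetition factor: tolerates 1 flipped bit per 3-bit block

def _encode(bits):
    return [b for b in bits for _ in range(R)]

def _decode(bits):
    return [1 if sum(bits[i:i + R]) > R // 2 else 0
            for i in range(0, len(bits), R)]

def sketch(w):
    """Public sketch s = w XOR C(k) for a fresh random codeword C(k)."""
    k = [secrets.randbelow(2) for _ in range(len(w) // R)]
    return [wi ^ ci for wi, ci in zip(w, _encode(k))]

def recover(w_noisy, s):
    """Recover w from a close w': decode w' XOR s back to the codeword."""
    codeword = _encode(_decode([wi ^ si for wi, si in zip(w_noisy, s)]))
    return [si ^ ci for si, ci in zip(s, codeword)]

def extract_key(w):
    """Fuzzy-extractor flavour: hash the recovered string into a key."""
    return hashlib.sha256(bytes(w)).hexdigest()
```

With one bit flipped in a 12-bit input, `recover` returns the original string exactly, so the derived key is stable under noise.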
Adaptive Protocols for Information Dissemination in Wireless Sensor Networks
, 1999
"... In this paper, we present a family of adaptive protocols, called SPIN (Sensor Protocols for Information via Negotiation), that efficiently disseminates information among sensors in an energy-constrained wireless sensor network. Nodes running a SPIN communication protocol name their data using high-lev ..."
Cited by 662 (10 self)
In this paper, we present a family of adaptive protocols, called SPIN (Sensor Protocols for Information via Negotiation), that efficiently disseminates information among sensors in an energy-constrained wireless sensor network. Nodes running a SPIN communication protocol name their data using high-level data descriptors, called metadata. They use metadata negotiations to eliminate the transmission of redundant data throughout the network. In addition, SPIN nodes can base their communication decisions both upon application-specific knowledge of the data and upon knowledge of the resources that are available to them. This allows the sensors to efficiently distribute data given a limited energy supply. We simulate and analyze the performance of two specific SPIN protocols, comparing them to other possible approaches and a theoretically optimal protocol. We find that the SPIN protocols can deliver 60% more data for a given amount of energy than conventional approaches. We also find that, in terms...
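The negotiation idea above can be sketched as a three-step exchange (advertise metadata, request only unseen data, then send the payload). A toy two-node version, assuming simple string metadata and direct method calls in place of wireless broadcast (both assumptions are mine, not the paper's protocol details):

```python
class SpinNode:
    """Minimal point-to-point SPIN-style negotiation (ADV -> REQ -> DATA)."""

    def __init__(self, name):
        self.name = name
        self.store = {}        # metadata -> data
        self.data_sent = 0     # count of (expensive) DATA transmissions

    def advertise(self, meta, neighbor):
        # ADV: describe the data by its metadata; send DATA only on REQ.
        if neighbor.on_adv(meta):
            self.data_sent += 1
            neighbor.on_data(meta, self.store[meta])

    def on_adv(self, meta):
        # REQ only if the data is new to this node.
        return meta not in self.store

    def on_data(self, meta, data):
        self.store[meta] = data

a, b = SpinNode("a"), SpinNode("b")
a.store["temp@sensor1"] = b"23.5C"
a.advertise("temp@sensor1", b)   # new to b: REQ follows, DATA is sent
a.advertise("temp@sensor1", b)   # b already holds it: no DATA sent
```

The second advertisement costs only the small metadata message, which is how negotiation suppresses the redundant transmissions that plain flooding would incur.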