Results 1 – 10 of 6,158
Byzantine disk paxos: optimal resilience with Byzantine shared memory
 Distributed Computing
, 2006
"... We present Byzantine Disk Paxos, an asynchronous shared-memory consensus protocol that uses a collection of n > 3t disks, t of which may fail by becoming non-responsive or arbitrarily corrupted. We give two constructions of this protocol; that is, we construct two different building blocks, each of ..."
Cited by 48 (3 self)
, t of which can be non-responsive or Byzantine. All the previous wait-free constructions in this model used at least 4t + 1 fault-prone registers, and we are not familiar with any prior FW-terminating constructions in this model. Categories and Subject Descriptors B.3.2 [Memory Structures]: Design Styles
Evaluating the Accuracy of Sampling-Based Approaches to the Calculation of Posterior Moments
 In Bayesian Statistics
, 1992
"... Data augmentation and Gibbs sampling are two closely related, sampling-based approaches to the calculation of posterior moments. The fact that each produces a sample whose constituents are neither independent nor identically distributed complicates the assessment of convergence and numerical accurac ..."
Cited by 604 (12 self)
accuracy of the approximations to the expected value of functions of interest under the posterior. In this paper methods from spectral analysis are used to evaluate numerical accuracy formally and construct diagnostics for convergence. These methods are illustrated in the normal linear model
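The convergence check this abstract describes can be illustrated with a hedged numerical sketch. The paper estimates numerical standard errors via spectral analysis; the version below substitutes a cruder batch-means estimate (an assumption for brevity, not the paper's method) to form a Geweke-style z-score comparing early and late segments of a chain:

```python
import numpy as np

def batch_means_se(x, n_batches=20):
    """Numerical standard error of the chain mean via batch means —
    a crude stand-in for the spectral-density-at-zero estimate used
    in the paper; it accounts for autocorrelation at the batch scale."""
    means = np.array([b.mean() for b in np.array_split(x, n_batches)])
    return means.std(ddof=1) / np.sqrt(n_batches)

def geweke_z(chain, first=0.1, last=0.5):
    """Z-score comparing the means of the early and late chain segments."""
    a = chain[: int(first * len(chain))]
    b = chain[-int(last * len(chain)):]
    return (a.mean() - b.mean()) / np.hypot(batch_means_se(a),
                                            batch_means_se(b))

rng = np.random.default_rng(0)
chain = rng.normal(size=5000)   # stand-in for MCMC draws of one quantity
z = geweke_z(chain)             # |z| well above 2 suggests non-convergence
```

For a genuinely converged chain the z-score behaves roughly like a standard normal draw.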
Fuzzy extractors: How to generate strong keys from biometrics and other noisy data
, 2008
"... We provide formal definitions and efficient secure techniques for • turning noisy information into keys usable for any cryptographic application, and, in particular, • reliably and securely authenticating biometric data. Our techniques apply not just to biometric information, but to any keying mater ..."
Cited by 535 (38 self)
, it can be used to reliably reproduce error-prone biometric inputs without incurring the security risk inherent in storing them. We define the primitives to be both formally secure and versatile, generalizing much prior work. In addition, we provide nearly optimal constructions of both primitives
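The idea of reproducing a noisy reading without storing it can be sketched with the classic code-offset secure sketch — a toy instance with a length-3 repetition code, not the paper's near-optimal constructions:

```python
import numpy as np

# Code-offset secure sketch over the length-3 repetition code
# {000, 111}: the public helper string s = w XOR c masks the enrolled
# reading w with a random codeword c, yet allows correcting one flipped
# bit in a later re-reading of w.

CODEWORDS = np.array([[0, 0, 0], [1, 1, 1]])

def sketch(w, rng):
    c = CODEWORDS[rng.integers(2)]   # random codeword
    return w ^ c                     # public helper string, hides w

def recover(w_noisy, s):
    v = w_noisy ^ s                  # a codeword plus the noise pattern
    c = CODEWORDS[1] if v.sum() >= 2 else CODEWORDS[0]   # majority decode
    return s ^ c                     # exact original reading

rng = np.random.default_rng(1)
w = np.array([1, 0, 1])              # enrolled biometric bits
s = sketch(w, rng)
w_noisy = w ^ np.array([0, 1, 0])    # one bit flipped at re-reading
w_back = recover(w_noisy, s)
```

A real fuzzy extractor would additionally hash the recovered string into a uniform key; this sketch shows only the error-correction half.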
Lambertian Reflectance and Linear Subspaces
, 2000
"... We prove that the set of all reflectance functions (the mapping from surface normals to intensities) produced by Lambertian objects under distant, isotropic lighting lies close to a 9D linear subspace. This implies that, in general, the set of images of a convex Lambertian object obtained under a wi ..."
Cited by 526 (20 self)
wide variety of lighting conditions can be approximated accurately by a low-dimensional linear subspace, explaining prior empirical results. We also provide a simple analytic characterization of this linear space. We obtain these results by representing lighting using spherical harmonics and describing
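The 9D subspace in question is spanned by the first nine spherical harmonics evaluated at the surface normal. A minimal sketch, using the common real-spherical-harmonic constants (the exact normalization convention is an assumption here):

```python
import numpy as np

def sh9(n):
    """First nine real spherical harmonics at a unit normal n = (x, y, z).
    A Lambertian pixel intensity is then approximately the dot product of
    this basis with nine lighting coefficients."""
    x, y, z = n
    return np.array([
        0.282095,                       # l = 0
        0.488603 * y,                   # l = 1
        0.488603 * z,
        0.488603 * x,
        1.092548 * x * y,               # l = 2
        1.092548 * y * z,
        0.315392 * (3 * z * z - 1),
        1.092548 * x * z,
        0.546274 * (x * x - y * y),
    ])

basis = sh9(np.array([0.0, 0.0, 1.0]))  # normal pointing at the camera
```

Stacking `sh9` over all visible normals gives the 9-column matrix whose column space approximates the set of images of the object.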
Prior distributions for variance parameters in hierarchical models
 Bayesian Analysis
, 2006
"... Various noninformative prior distributions have been suggested for scale parameters in hierarchical models. We construct a new folded-noncentral-t family of conditionally conjugate priors for hierarchical standard deviation parameters, and then consider noninformative and weakly informative priors i ..."
Cited by 430 (15 self)
Various noninformative prior distributions have been suggested for scale parameters in hierarchical models. We construct a new folded-noncentral-t family of conditionally conjugate priors for hierarchical standard deviation parameters, and then consider noninformative and weakly informative priors
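The half-Cauchy — the folded noncentral t with one degree of freedom — is the weakly informative member of this family the paper highlights for hierarchical standard deviations. A minimal sampling sketch (the scale value 25 is one of the paper's examples; any use would tune it to the problem):

```python
import numpy as np

def half_cauchy(scale, size, rng):
    """Draws from a half-Cauchy(0, scale) prior: the absolute value of a
    scaled standard Cauchy, i.e. a folded noncentral t with 1 df."""
    return np.abs(scale * rng.standard_cauchy(size))

rng = np.random.default_rng(0)
tau = half_cauchy(scale=25.0, size=1000, rng=rng)  # prior draws for a
                                                   # group-level sd
```

The heavy right tail keeps the prior weakly informative, while the finite density at zero avoids the pathologies of inverse-gamma priors near small variances.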
SMOTE: Synthetic Minority Over-sampling Technique
 Journal of Artificial Intelligence Research
, 2002
"... An approach to the construction of classifiers from imbalanced datasets is described. A dataset is imbalanced if the classification categories are not approximately equally represented. Often real-world data sets are predominately composed of "normal" examples with only a small percentag ..."
Cited by 634 (27 self)
An approach to the construction of classifiers from imbalanced datasets is described. A dataset is imbalanced if the classification categories are not approximately equally represented. Often real-world data sets are predominately composed of "normal" examples with only a small
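The over-sampling step itself is simple to sketch: each synthetic minority example is placed at a random point on the segment between a minority sample and one of its k nearest minority-class neighbours. A minimal numpy version (brute-force neighbour search, for illustration only):

```python
import numpy as np

def smote(X_min, n_synthetic, k=5, rng=None):
    """Generate synthetic minority-class samples by interpolating each
    chosen sample toward one of its k nearest minority neighbours."""
    rng = rng or np.random.default_rng()
    # brute-force pairwise distances within the minority class
    d = np.linalg.norm(X_min[:, None] - X_min[None, :], axis=-1)
    np.fill_diagonal(d, np.inf)               # exclude self-neighbours
    nn = np.argsort(d, axis=1)[:, :k]         # k nearest neighbours
    out = []
    for _ in range(n_synthetic):
        i = rng.integers(len(X_min))          # random minority sample
        j = nn[i, rng.integers(min(k, len(X_min) - 1))]
        gap = rng.random()                    # random point on the segment
        out.append(X_min[i] + gap * (X_min[j] - X_min[i]))
    return np.array(out)

X_min = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
X_new = smote(X_min, n_synthetic=10, k=2, rng=np.random.default_rng(0))
```

Because each synthetic point is a convex combination of two minority samples, the new examples stay inside the minority region rather than duplicating existing points.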
Relative absorptive capacity and interorganizational learning
 Strategic Management Journal
, 1998
"... Much of the prior research on interorganizational learning has focused on the role of absorptive capacity, a firm’s ability to value, assimilate, and utilize new external knowledge. However, this definition of the construct suggests that a firm has an equal capacity to learn from all other organizat ..."
Cited by 463 (2 self)
Much of the prior research on interorganizational learning has focused on the role of absorptive capacity, a firm’s ability to value, assimilate, and utilize new external knowledge. However, this definition of the construct suggests that a firm has an equal capacity to learn from all other
Data Compression Using Adaptive Coding and Partial String Matching
 IEEE Transactions on Communications
, 1984
"... The recently developed technique of arithmetic coding, in conjunction with a Markov model of the source, is a powerful method of data compression in situations where a linear treatment is inappropriate. Adaptive coding allows the model to be constructed dynamically by both encoder and decoder during ..."
Cited by 442 (20 self)
The recently developed technique of arithmetic coding, in conjunction with a Markov model of the source, is a powerful method of data compression in situations where a linear treatment is inappropriate. Adaptive coding allows the model to be constructed dynamically by both encoder and decoder
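The adaptive-model idea in this abstract — encoder and decoder building identical statistics from the data already coded, so no model is ever transmitted — can be sketched with an order-0 model. (The paper's PPM method additionally blends higher-order contexts with escape probabilities; that machinery is omitted here.)

```python
from collections import Counter

class AdaptiveModel:
    """Order-0 adaptive model: both sides start from the same uniform
    counts and update them identically after each symbol, so the
    probabilities fed to an arithmetic coder always agree."""
    def __init__(self, alphabet):
        self.counts = Counter({s: 1 for s in alphabet})  # Laplace start
        self.total = len(alphabet)

    def prob(self, symbol):
        return self.counts[symbol] / self.total

    def update(self, symbol):
        self.counts[symbol] += 1
        self.total += 1

enc, dec = AdaptiveModel("ab"), AdaptiveModel("ab")
for s in "aab":
    p_enc, p_dec = enc.prob(s), dec.prob(s)  # identical on both sides
    enc.update(s)
    dec.update(s)
```

An arithmetic coder driven by `prob` would emit roughly -log2(p) bits per symbol, so frequent symbols get cheaper as the counts adapt.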
Distributed Source Coding Using Syndromes (DISCUS): Design and Construction
 IEEE Transactions on Information Theory
, 1999
"... We address the problem of distributed source coding, i.e. compression of correlated sources that are not co-located and/or cannot communicate with each other to minimize their joint description cost. In this work we tackle the related problem of compressing a source that is correlated with anothe ..."
Cited by 407 (9 self)
with another source which is, however, available only at the decoder. In contrast to prior information-theoretic approaches, we introduce a new constructive and practical framework for tackling the problem based on the judicious incorporation of channel coding principles into this source coding problem. We
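The syndrome idea can be shown on a toy instance (this 3-bit example with a repetition code is an illustration, far smaller than the trellis codes the paper designs). The encoder sends only the syndrome of X; the decoder picks the coset member closest to its side information Y:

```python
import numpy as np

# Toy DISCUS instance: X is 3 bits and the decoder's side information Y
# differs from X in at most one position. The encoder sends the 2-bit
# syndrome of X under the repetition code {000, 111} instead of all
# 3 bits; the decoder resolves the coset using Y.

H = np.array([[1, 1, 0],
              [0, 1, 1]])                 # parity-check matrix

def encode(x):
    return H @ x % 2                       # 2-bit syndrome of x

def decode(syndrome, y):
    # enumerate the coset of the code with this syndrome, pick the
    # member nearest the side information y in Hamming distance
    coset = [np.array(v) for v in np.ndindex(2, 2, 2)
             if (H @ np.array(v) % 2 == syndrome).all()]
    return min(coset, key=lambda v: int(np.abs(v - y).sum()))

x = np.array([1, 0, 1])                    # source word at the encoder
y = np.array([1, 1, 1])                    # side info at the decoder
x_hat = decode(encode(x), y)               # 2 bits sent instead of 3
```

Whenever Y is within Hamming distance 1 of X, the decoder recovers X exactly, matching the Slepian-Wolf intuition that the correlation supplies the missing bit.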
Principal Curves
, 1989
"... Principal curves are smooth one-dimensional curves that pass through the middle of a p-dimensional data set, providing a nonlinear summary of the data. They are nonparametric, and their shape is suggested by the data. The algorithm for constructing principal curves starts with some prior summary, suc ..."
Cited by 394 (1 self)
Principal curves are smooth one-dimensional curves that pass through the middle of a p-dimensional data set, providing a nonlinear summary of the data. They are nonparametric, and their shape is suggested by the data. The algorithm for constructing principal curves starts with some prior summary
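The iteration the abstract alludes to alternates a projection step with a conditional-expectation (smoothing) step. A loose numpy sketch — initialised from the first principal component as the "prior summary", with a running mean standing in for the paper's scatterplot smoother:

```python
import numpy as np

def principal_curve(X, n_iter=10, span=0.15):
    """Crude projection/averaging iteration for a principal curve:
    (1) order the points along the current curve, (2) smooth each
    coordinate with a running mean, (3) re-project and repeat."""
    w = max(3, int(span * len(X)))           # smoothing window
    kernel = np.ones(w) / w
    Xc = X - X.mean(axis=0)
    pc1 = np.linalg.svd(Xc, full_matrices=False)[2][0]
    order = np.argsort(Xc @ pc1)             # initial ordering along PC1
    for _ in range(n_iter):
        # expectation step: smooth the points in their current order
        curve = np.column_stack(
            [np.convolve(c, kernel, mode="same") for c in X[order].T])
        # projection step: nearest curve point for each sample
        idx = ((X[:, None] - curve[None]) ** 2).sum(-1).argmin(1)
        order = np.argsort(idx, kind="stable")
    return curve

rng = np.random.default_rng(0)
t = np.linspace(0, np.pi, 200)
X = np.column_stack([t, np.sin(t)]) + 0.05 * rng.normal(size=(200, 2))
curve = principal_curve(X)
```

The running-mean smoother shrinks the curve near its endpoints; the paper addresses this and related bias issues with a proper local smoother.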