Results 1 - 10 of 29,973
Finding Meaning in Error Terms
2007
Cited by 17 (1 self)
"(In memory of Serge Lang) Four decades ago, Mikio Sato and John Tate predicted the shape of probability distributions to which certain “error terms” in number theory conform. Their prediction—known as the Sato-Tate ..."
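The prediction the snippet alludes to can be stated precisely; the following is a standard formulation (not taken from this abstract), for an elliptic curve over Q without complex multiplication:

```latex
% Sato-Tate: for a non-CM elliptic curve E/\mathbb{Q} and a prime p of
% good reduction, write the trace of Frobenius as
%   a_p = 2\sqrt{p}\,\cos\theta_p, \qquad \theta_p \in [0,\pi].
% The conjecture (now a theorem for such curves) asserts that the
% angles \theta_p are equidistributed with respect to the measure
\mu_{\mathrm{ST}} \;=\; \frac{2}{\pi}\,\sin^{2}\theta \;d\theta .
```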
Near Shannon limit error-correcting coding and decoding
1993
Cited by 1776 (6 self)
"This paper deals with a new class of convolutional codes called Turbo-codes, whose performance in terms of Bit Error Rate (BER) is close to the Shannon limit. The Turbo-code encoder is built using a parallel concatenation of two Recursive Systematic Convolutional codes and the associated ..."
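The parallel-concatenation idea in the snippet can be sketched in a few lines. This is only an illustration: the generator polynomials (1, 5/7 in octal) are a common textbook choice and not necessarily the ones used in the paper, and trellis termination, puncturing, and the iterative decoder are all omitted.

```python
import numpy as np

def rsc_encode(bits):
    """Rate-1/2 Recursive Systematic Convolutional (RSC) encoder with
    feedback polynomial 1+D+D^2 (octal 7) and parity polynomial 1+D^2
    (octal 5). Returns the systematic and parity bit streams."""
    s1, s2 = 0, 0
    sys_out, par_out = [], []
    for d in bits:
        fb = d ^ s1 ^ s2        # feedback bit (taps: input, s1, s2)
        p = fb ^ s2             # parity bit   (taps: feedback, s2)
        sys_out.append(d)
        par_out.append(p)
        s1, s2 = fb, s1         # shift the register
    return sys_out, par_out

def turbo_encode(bits, perm):
    """Rate-1/3 parallel concatenation: systematic bits, parity from
    one RSC on the input, parity from a second RSC on the interleaved
    input (perm is the interleaver permutation)."""
    sys1, par1 = rsc_encode(bits)
    par2 = rsc_encode([bits[i] for i in perm])[1]
    return sys1 + par1 + par2

rng = np.random.default_rng(0)
msg = [int(b) for b in rng.integers(0, 2, 16)]
perm = list(rng.permutation(16))
codeword = turbo_encode(msg, perm)   # 3 output bits per input bit
```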
Good Error-Correcting Codes based on Very Sparse Matrices
1999
Cited by 750 (23 self)
"We study two families of error-correcting codes defined in terms of very sparse matrices. "MN" (MacKay-Neal) codes were recently invented, and "Gallager codes" were first investigated in 1962, but appear to have been largely forgotten, in spite of their excellent properties. The ..."
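For readers unfamiliar with codes "defined in terms of" a matrix: a parity-check matrix H defines the code as the set of vectors c with Hc = 0 (mod 2). The sketch below builds codewords from a random sparse H in systematic form; it illustrates only the membership condition, not the actual MN/Gallager constructions or their iterative decoding, and the sizes and density are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(1)
n, m = 20, 10                  # code length n, number of parity checks m
k = n - m                      # message length

# A sparse random block A; H = [A | I] is a parity-check matrix in
# systematic form, so c = (u, A u mod 2) automatically satisfies
# H c = A u + A u = 0 (mod 2) for any message u.
A = (rng.random((m, k)) < 0.3).astype(int)
H = np.hstack([A, np.eye(m, dtype=int)])

u = rng.integers(0, 2, k)               # message bits
c = np.concatenate([u, A @ u % 2])      # systematic codeword

syndrome = H @ c % 2                    # all-zero iff c is a codeword
```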
Testing War in the Error Term
"Researchers interested in the causes of war should note that the proof in “War Is in the Error Term” contains a mistake. Correcting the logic appears to reduce the advantage of large samples in testing rational explanations for war. The proof, presented by Erik Gartzke in the Summer 1999 issue of I ..."
Model-Based Analysis of Oligonucleotide Arrays: Model Validation, Design Issues and Standard Error Application
2001
Cited by 775 (28 self)
"Background: A model-based analysis of oligonucleotide expression arrays we developed previously uses a probe-sensitivity index to capture the response characteristic of a specific probe pair and calculates model-based expression indexes (MBEI). MBEI has standard error attached to it as a measure of ..."
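The multiplicative model behind this analysis (as published by Li and Wong) takes the background-corrected probe signal as roughly theta_i * phi_j, where theta_i is the expression index for array i and phi_j the sensitivity index for probe pair j. The alternating-least-squares fit below is a simplified sketch of that idea only; the actual method also handles outliers and attaches standard errors to the estimates.

```python
import numpy as np

def fit_multiplicative(Y, iters=20):
    """Fit Y[i, j] ~ theta[i] * phi[j] by alternating least squares.
    phi is normalized so that sum(phi**2) equals the number of probes,
    a common identifiability constraint for this model."""
    I, J = Y.shape
    phi = np.ones(J)
    for _ in range(iters):
        theta = Y @ phi / (phi @ phi)        # LS update for theta
        phi = Y.T @ theta / (theta @ theta)  # LS update for phi
        scale = np.sqrt(J / (phi @ phi))     # enforce the constraint
        phi *= scale
        theta /= scale
    return theta, phi

# Synthetic check: exact rank-one data should be recovered.
rng = np.random.default_rng(2)
theta_true = rng.uniform(1, 5, 8)      # 8 hypothetical arrays
phi_true = rng.uniform(0.5, 2, 12)     # 12 hypothetical probe pairs
Y = np.outer(theta_true, phi_true)
theta, phi = fit_multiplicative(Y)
```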
War Is in the Error Term
International Organization, 2003
Cited by 19 (0 self)
"The main theoretical task facing students of war is not to add to the already long list of arguments and conjectures but instead to take apart and reassemble these diverse arguments into a coherent theory fit for guiding empirical research. At least since Thucydides, students of international relations have sought rational explanations for the advent of war. Rationalist explanations assume purposive action; states are said to make reasoned decisions about the use of force. Although rationalist explanations have proven persuasive and durable, and offer the basis for cumulative theorizing, they also imply substantial limits on what we can know about war. I show that the most general rationalist explanation for war also dictates that the onset of war is theoretically indeterminate. We cannot predict in individual cases whether states will go to war, because war is typically the consequence of variables that are unobservable ex ante, both to us as researchers and to the participants. Thinking probabilistically continues to offer the opportunity to assess international conflict empirically. However, the realization that uncertainty is necessary theoretically to motivate war is much different from recognizing that the empirical world contains a stochastic element. Accepting uncertainty as a necessary condition of war implies that all other variables—however detailed the explanation—serve to eliminate gradations of irrelevant alternatives. We can progressively refine our ability to distinguish states that may use force from those that are likely to remain at peace, but anticipating wars from a pool of states that appear willing to fight will remain problematic. For example, we may achieve considerable success in anticipating crises, but our ability to predict which crises will become wars will probably prove little better than the naive predictions of random chance."
Stable signal recovery from incomplete and inaccurate measurements
Comm. Pure Appl. Math., 2006
Cited by 1397 (38 self)
"Suppose we wish to recover a vector x0 ∈ R^m (e.g., a digital signal or image) from incomplete and contaminated observations y = A x0 + e; A is an n × m matrix with far fewer rows than columns (n ≪ m) and e is an error term. Is it possible to recover x0 accurately based on the data y? To r ..."
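The recovery programs studied in this line of work are l1-regularized convex problems. As an elementary illustration (not the authors' algorithm), the sketch below solves min_x 0.5*||Ax - y||^2 + lam*||x||_1 by iterative soft-thresholding (ISTA) on a synthetic underdetermined problem; the problem sizes and the regularization weight are arbitrary choices.

```python
import numpy as np

def soft(v, t):
    """Soft-thresholding: the proximal operator of t * ||.||_1."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def ista(A, y, lam, iters=300):
    """Iterative soft-thresholding for
        min_x 0.5 * ||A x - y||^2 + lam * ||x||_1,
    using the standard step size 1/L, L = ||A||_2^2."""
    L = np.linalg.norm(A, 2) ** 2     # Lipschitz constant of the gradient
    x = np.zeros(A.shape[1])
    for _ in range(iters):
        x = soft(x + A.T @ (y - A @ x) / L, lam / L)
    return x

rng = np.random.default_rng(3)
n, m, k = 80, 200, 5                       # n << m: underdetermined
A = rng.standard_normal((n, m)) / np.sqrt(n)
x0 = np.zeros(m)
x0[rng.choice(m, k, replace=False)] = rng.standard_normal(k)  # sparse signal
e = 0.01 * rng.standard_normal(n)          # small error term
y = A @ x0 + e
x_hat = ista(A, y, lam=0.02)
```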
Long Short-term Memory
1995
Cited by 454 (58 self)
""Recurrent backprop" for learning to store information over extended time intervals takes too long. The main reason is insufficient, decaying error back flow. We briefly review Hochreiter's 1991 analysis of this problem. Then we overcome it by introducing a novel, efficient method called "Long Short-Term Memory" (LSTM). LSTM can learn to bridge minimal time lags in excess of 1000 time steps by enforcing constant error flow through internal states of special units. Multiplicative gate units learn to open and close access to constant error flow. LSTM's update ..."
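The "constant error flow through internal states" and "multiplicative gate units" can be seen in a single cell update. Note this sketch uses the modern formulation with a forget gate, which was added in later work; the version in the 1995 report lacked it. Weight shapes and initialization here are arbitrary.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def lstm_step(x, h, c, W, b):
    """One LSTM cell step. W: (4*H, D+H), b: (4*H,), with H the
    hidden size and D the input size."""
    H = h.shape[0]
    z = W @ np.concatenate([x, h]) + b
    i = sigmoid(z[0*H:1*H])      # input gate: admit new information
    f = sigmoid(z[1*H:2*H])      # forget gate: keep/erase cell state
    o = sigmoid(z[2*H:3*H])      # output gate: expose cell state
    g = np.tanh(z[3*H:4*H])      # candidate cell update
    c_new = f * c + i * g        # additive update: the path along which
                                 # error flows (nearly) unattenuated
    h_new = o * np.tanh(c_new)
    return h_new, c_new

rng = np.random.default_rng(4)
D, H = 3, 5
W = 0.1 * rng.standard_normal((4 * H, D + H))
b = np.zeros(4 * H)
h, c = np.zeros(H), np.zeros(H)
for t in range(10):              # run a short input sequence
    h, c = lstm_step(rng.standard_normal(D), h, c, W, b)
```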
Bayesian Analysis of Stochastic Volatility Models
1994
Cited by 601 (26 self)
"... this article is to develop new methods for inference and prediction in a simple class of stochastic volatility models in which the logarithm of conditional volatility follows an autoregressive (AR) time series model. Unlike the autoregressive conditional heteroscedasticity (ARCH) and generalized ARCH (GARCH) models [see Bollerslev, Chou, and Kroner (1992) for a survey of ARCH modeling], both the mean and log-volatility equations have separate error terms. The ease of evaluating the ARCH likelihood function and the ability of the ARCH specification to accommodate the time-varying volatility ..."
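The "separate error terms" point is concrete in simulation: the basic stochastic-volatility model drives the log-volatility AR(1) and the observation equation with two independent shocks, whereas ARCH/GARCH volatility is a deterministic function of past data. The parameter values below are illustrative only.

```python
import numpy as np

def simulate_sv(T, mu=-1.0, phi=0.95, sigma_eta=0.2, seed=5):
    """Simulate a basic stochastic-volatility model:
        h_t = mu + phi * (h_{t-1} - mu) + sigma_eta * eta_t
        y_t = exp(h_t / 2) * eps_t
    with independent eta_t, eps_t ~ N(0, 1): two separate error
    terms, one per equation."""
    rng = np.random.default_rng(seed)
    h = np.empty(T)
    # start from the stationary distribution of the AR(1)
    h[0] = mu + sigma_eta / np.sqrt(1 - phi**2) * rng.standard_normal()
    for t in range(1, T):
        h[t] = mu + phi * (h[t-1] - mu) + sigma_eta * rng.standard_normal()
    y = np.exp(h / 2) * rng.standard_normal(T)
    return y, h

y, h = simulate_sv(1000)
```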