Results 1–10 of 29,973
Finding Meaning in Error Terms
, 2007
"... (In memory of Serge Lang) Four decades ago, Mikio Sato and John Tate predicted the shape of probability distributions to which certain “error terms” in number theory conform. Their prediction—known as the Sato-Tate ..."
Cited by 17 (1 self)
Near Shannon limit error-correcting coding and decoding
, 1993
"... This paper deals with a new class of convolutional codes called Turbo-codes, whose performances in terms of Bit Error Rate (BER) are close to the SHANNON limit. The Turbo-Code encoder is built using a parallel concatenation of two Recursive Systematic Convolutional codes and the associated ..."
Cited by 1776 (6 self)
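The parallel concatenation described in this abstract can be sketched in a few lines: two identical Recursive Systematic Convolutional (RSC) encoders, the second fed an interleaved copy of the information bits, producing a rate-1/3 systematic codeword. The generator polynomials, interleaver, and block length below are illustrative choices, not those of the paper; trellis termination, puncturing, and the iterative decoder are omitted.

```python
import numpy as np

def rsc_parity(bits):
    """Parity stream of a toy Recursive Systematic Convolutional encoder
    with generators (1, (1 + D^2)/(1 + D + D^2))."""
    s1, s2 = 0, 0
    parity = []
    for u in bits:
        a = u ^ s1 ^ s2        # recursive feedback: 1 + D + D^2
        parity.append(a ^ s2)  # forward taps: 1 + D^2
        s1, s2 = a, s1
    return parity

rng = np.random.default_rng(0)
u = rng.integers(0, 2, size=8).tolist()       # information bits
interleaver = rng.permutation(len(u))         # pseudo-random interleaver

p1 = rsc_parity(u)                            # first constituent encoder
p2 = rsc_parity([u[i] for i in interleaver])  # second sees permuted bits
codeword = u + p1 + p2                        # rate-1/3 systematic output
print(len(codeword))                          # 24 coded bits for 8 info bits
```

The interleaver is what makes the concatenation powerful: the two parity streams describe the same bits in different orders, so the constituent decoders can exchange extrinsic information iteratively.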
Good Error-Correcting Codes Based on Very Sparse Matrices
, 1999
"... We study two families of error-correcting codes defined in terms of very sparse matrices. "MN" (MacKay-Neal) codes are recently invented, and "Gallager codes" were first investigated in 1962, but appear to have been largely forgotten, in spite of their excellent properties. The ..."
Cited by 750 (23 self)
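Both families in this abstract are defined by a sparse parity-check matrix H: a binary vector c is a codeword exactly when every parity check passes, i.e. Hc = 0 (mod 2), and a nonzero syndrome reveals channel errors. A minimal sketch with a toy H (my own small example, not one of the paper's constructions):

```python
import numpy as np

# Toy sparse parity-check matrix: 3 checks over 6 bits.
H = np.array([
    [1, 1, 0, 1, 0, 0],
    [0, 1, 1, 0, 1, 0],
    [1, 0, 1, 0, 0, 1],
])

def syndrome(word):
    """All-zero result means every parity check passes."""
    return H @ word % 2

codeword = np.array([1, 0, 1, 1, 1, 0])
assert not syndrome(codeword).any()  # a valid codeword of this code

received = codeword.copy()
received[2] ^= 1                     # one bit flipped by the channel
print(syndrome(received))            # [0 1 1]: the two checks on bit 2 fail
```

Sparsity is what matters at scale: each check touches only a few bits, so iterative message-passing decoding stays cheap even at long block lengths.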
Testing War in the Error Term
"... Researchers interested in the causes of war should note that the proof in “War Is in the Error Term” contains a mistake. Correcting the logic appears to reduce the advantage of large samples in testing rational explanations for war. The proof, presented by Erik Gartzke in the Summer 1999 issue of I ..."
Model-Based Analysis of Oligonucleotide Arrays: Model Validation, Design Issues and Standard Error Application
, 2001
"... Background: A model-based analysis of oligonucleotide expression arrays we developed previously uses a probe-sensitivity index to capture the response characteristic of a specific probe pair and calculates model-based expression indexes (MBEI). MBEI has standard error attached to it as a measure of ..."
Cited by 775 (28 self)
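The MBEI model in this abstract is multiplicative: the response of probe j on array i is modeled as θ_i·φ_j plus an error term, where θ_i is the expression index and φ_j the probe-sensitivity index. Under that assumption the fit is a rank-1 approximation of the array-by-probe matrix, which the leading singular triple gives directly. A sketch on synthetic data (the paper's actual procedure is an iterative least-squares fit with outlier handling, and the normalization of φ below is one common identifiability choice, not necessarily the paper's):

```python
import numpy as np

rng = np.random.default_rng(0)
n_arrays, n_probes = 6, 10
theta = rng.uniform(1, 5, size=n_arrays)   # true expression indexes
phi = rng.uniform(0.5, 2, size=n_probes)   # true probe sensitivities
Y = np.outer(theta, phi) + 0.05 * rng.normal(size=(n_arrays, n_probes))

# Rank-1 fit Y ~ theta_hat * phi_hat^T from the leading singular triple.
U, s, Vt = np.linalg.svd(Y)
phi_hat = Vt[0] if Vt[0].sum() > 0 else -Vt[0]  # fix the sign ambiguity
phi_hat = phi_hat * np.sqrt(n_probes)           # scale: sum(phi_hat^2) = J
theta_hat = Y @ phi_hat / (phi_hat @ phi_hat)   # least squares given phi_hat

residual = np.linalg.norm(Y - np.outer(theta_hat, phi_hat))
print(residual / np.linalg.norm(Y))             # small: rank-1 fits well
```

The scale of θ and φ is only identifiable jointly, hence the normalization constraint; the standard errors the abstract mentions come from the curvature of this least-squares fit.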
War Is in the Error Term
 International Organization
, 2003
"... The main theoretical task facing students of war is not to add to the already long list of arguments and conjectures but instead to take apart and reassemble these diverse arguments into a coherent theory fit for guiding empirical research. At least since Thucydides, students of international relations have sought rational explanations for the advent of war. Rationalist explanations assume purposive action; states are said to make reasoned decisions about the use of force. Although rationalist explanations have proven persuasive and durable and offer the basis for cumulative theorizing, they also imply substantial limits on what we can know about war. I show that the most general rationalist explanation for war also dictates that the onset of war is theoretically indeterminate. We cannot predict in individual cases whether states will go to war, because war is typically the consequence of variables that are unobservable ex ante, both to us as researchers and to the participants. Thinking probabilistically continues to offer the opportunity to assess international conflict empirically. However, the realization that uncertainty is necessary theoretically to motivate war is much different from recognizing that the empirical world contains a stochastic element. Accepting uncertainty as a necessary condition of war implies that all other variables—however detailed the explanation—serve to eliminate gradations of irrelevant alternatives. We can progressively refine our ability to distinguish states that may use force from those that are likely to remain at peace, but anticipating wars from a pool of states that appear willing to fight will remain problematic. For example, we may achieve considerable success in anticipating crises, but our ability to predict which crises will become wars will probably prove little better than the naive predictions of random chance. The need for uncertainty to ..."
Cited by 19 (0 self)
Stable signal recovery from incomplete and inaccurate measurements
 Comm. Pure Appl. Math.,
, 2006
"... Suppose we wish to recover a vector x0 ∈ R^m (e.g., a digital signal or image) from incomplete and contaminated observations y = Ax0 + e; A is an n × m matrix with far fewer rows than columns (n ≪ m) and e is an error term. Is it possible to recover x0 accurately based on the data y? To r ..."
Cited by 1397 (38 self)
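A standard answer to the recovery question in this abstract is ℓ1 minimization: among all x consistent with the measurements, take the one of smallest ℓ1 norm, which can be written as a linear program. A sketch under simplifying assumptions that are not the paper's: noiseless observations (e = 0), a random Gaussian A, and a synthetic sparse x0.

```python
import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(0)
m, n, k = 20, 16, 2                  # signal length, measurements, sparsity

x0 = np.zeros(m)                     # synthetic k-sparse signal
x0[rng.choice(m, size=k, replace=False)] = rng.normal(size=k)
A = rng.normal(size=(n, m))          # n x m measurement matrix, n < m
y = A @ x0                           # noiseless observations y = A x0

# min ||x||_1  s.t.  Ax = y, as an LP over z = [x, t]:
# minimize sum(t) subject to -t <= x <= t and A x = y.
I = np.eye(m)
res = linprog(
    c=np.r_[np.zeros(m), np.ones(m)],
    A_ub=np.block([[I, -I], [-I, -I]]),
    b_ub=np.zeros(2 * m),
    A_eq=np.hstack([A, np.zeros((n, m))]),
    b_eq=y,
    bounds=[(None, None)] * m + [(0, None)] * m,
)
x_rec = res.x[:m]
print(np.max(np.abs(x_rec - x0)))    # near zero when l1 recovery succeeds
```

The paper's setting is harder: e is nonzero, so the equality constraint is relaxed to a norm bound on Ax − y, and the result bounds the recovery error in terms of the noise level.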
Long Short-term Memory
, 1995
"... "Recurrent backprop" for learning to store information over extended time intervals takes too long. The main reason is insufficient, decaying error back flow. We briefly review Hochreiter's 1991 analysis of this problem. Then we overcome it by introducing a novel, efficient method called "Long Short-Term Memory" (LSTM). LSTM can learn to bridge minimal time lags in excess of 1000 time steps by enforcing constant error flow through internal states of special units. Multiplicative gate units learn to open and close access to constant error flow. LSTM's update ..."
Cited by 454 (58 self)
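The "multiplicative gate units" and "constant error flow" in this abstract correspond to a cell whose state is updated additively, guarded by sigmoid gates; the unit self-connection on the state is what lets error flow back unchanged. A minimal forward pass in NumPy, following the original two-gate formulation (no forget gate, which later variants added); sizes and random weights are placeholders:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def lstm_step(x, h, c, W, U, b):
    """One step of an original-style LSTM cell: input gate i and output
    gate o guard an additive cell state c (the 'constant error carousel')."""
    i = sigmoid(W["i"] @ x + U["i"] @ h + b["i"])  # input gate
    o = sigmoid(W["o"] @ x + U["o"] @ h + b["o"])  # output gate
    g = np.tanh(W["g"] @ x + U["g"] @ h + b["g"])  # candidate cell input
    c_new = c + i * g           # additive update: d c_new / d c = 1
    h_new = o * np.tanh(c_new)  # gated output
    return h_new, c_new

rng = np.random.default_rng(0)
nx, nh = 3, 4                   # input size, number of memory cells
W = {k: rng.normal(size=(nh, nx)) for k in "iog"}
U = {k: rng.normal(size=(nh, nh)) for k in "iog"}
b = {k: np.zeros(nh) for k in "iog"}

h, c = np.zeros(nh), np.zeros(nh)
for _ in range(5):              # run a short input sequence
    h, c = lstm_step(rng.normal(size=nx), h, c, W, U, b)
print(h.shape, c.shape)         # (4,) (4,)
```

Because c_new = c + i*g has derivative 1 with respect to c, error backpropagated through the state neither decays nor explodes across time steps, which is exactly the failure mode of plain recurrent backprop described above.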
Bayesian Analysis of Stochastic Volatility Models
, 1994
"... this article is to develop new methods for inference and prediction in a simple class of stochastic volatility models in which the logarithm of conditional volatility follows an autoregressive (AR) time series model. Unlike the autoregressive conditional heteroscedasticity (ARCH) and generalized ARCH (GARCH) models [see Bollerslev, Chou, and Kroner (1992) for a survey of ARCH modeling], both the mean and log-volatility equations have separate error terms. The ease of evaluating the ARCH likelihood function and the ability of the ARCH specification to accommodate the time-varying volatility ..."
Cited by 601 (26 self)
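The "separate error terms" in this abstract refer to a two-equation model: an observation equation y_t = exp(h_t/2)·ε_t and an AR(1) log-volatility equation h_t = μ + φ(h_{t−1} − μ) + σ·η_t, each with its own shock. A simulation sketch with illustrative parameter values (not taken from the article):

```python
import numpy as np

rng = np.random.default_rng(0)
T = 1000
mu, phi, sigma = -1.0, 0.95, 0.2   # illustrative AR(1) parameters

# Log-volatility equation with its own error term eta_t
h = np.empty(T)
h[0] = mu
for t in range(1, T):
    h[t] = mu + phi * (h[t - 1] - mu) + sigma * rng.normal()

# Mean (observation) equation with a separate error term eps_t
y = np.exp(h / 2) * rng.normal(size=T)

# Volatility clustering: |y_t| tracks exp(h_t / 2)
print(np.corrcoef(np.abs(y), np.exp(h / 2))[0, 1])
```

Unlike ARCH/GARCH, where conditional variance is a deterministic function of past observations, h_t here is a latent process with its own noise, so the likelihood has no closed form; that is what motivates the article's Bayesian approach.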