Results 1–10 of 1,004,777
Efficient noise-tolerant learning from statistical queries
JOURNAL OF THE ACM, 1998
"... In this paper, we study the problem of learning in the presence of classification noise in the probabilistic learning model of Valiant and its variants. In order to identify the class of “robust” learning algorithms in the most general way, we formalize a new but related model of learning from stat ..."
Cited by 357 (5 self)
’s model and its variants can also be learned in the new model (and thus can be learned in the presence of noise). A notable exception to this statement is the class of parity functions, which we prove is not learnable from statistical queries, and for which no noise-tolerant algorithm is known.
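The snippet above describes learning from statistical queries as a route to noise tolerance. A minimal sketch of the key mechanism (my own illustration, not code from the paper; the toy target and all names are hypothetical): a statistical query such as a label correlation can still be estimated from examples whose labels are flipped independently at a known rate η, because the flips shrink the expectation by exactly (1 − 2η).

```python
import random

random.seed(1)
ETA = 0.2          # known classification-noise rate (must be < 1/2)
N = 20000          # number of noisy examples

def target(x):     # hypothetical target concept: the parity of bit 0 alone
    return x & 1

# Draw noisy examples (x, label), flipping each label with probability ETA.
samples = []
for _ in range(N):
    x = random.getrandbits(4)
    y = target(x) ^ (1 if random.random() < ETA else 0)
    samples.append((x, y))

# Statistical query: the correlation E[(-1)^f(x) * (-1)^g(x)] for g(x) = bit 0.
# Independent label flips shrink this expectation by exactly (1 - 2*ETA),
# so dividing the noisy estimate by (1 - 2*ETA) recovers the clean value.
noisy_corr = sum((-1) ** y * (-1) ** (x & 1) for x, y in samples) / N
corrected = noisy_corr / (1 - 2 * ETA)
```

Here the clean correlation is 1 (g equals the target), so the noisy estimate concentrates near 1 − 2η = 0.6 and the corrected one near 1.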
Noise-Tolerant Windowing
JOURNAL OF ARTIFICIAL INTELLIGENCE RESEARCH, 1998
"... Windowing has been proposed as a procedure for efficient memory use in the ID3 decision tree learning algorithm. However, it was shown that it may often lead to a decrease in performance, in particular in noisy domains. Following up on previous work, where we have demonstrated that the ability of ru ..."
Cited by 28 (4 self)
of rule learning algorithms to learn rules independently can be exploited for more efficient windowing procedures, we demonstrate in this paper how this property can be exploited to achieve noise-tolerance in windowing.
Instance-based learning algorithms
Machine Learning, 1991
"... Abstract. Storing and using specific instances improves the performance of several supervised learning algorithms. These include algorithms that learn decision trees, classification rules, and distributed networks. However, no investigation has analyzed algorithms that use only specific instances to ..."
Noise-Tolerant Conceptual Clustering*
"... Fisher (1987a,b) introduced a performance task for conceptual clustering: flexible prediction of arbitrary attribute values, not simply the prediction of a single 'class' attribute. This paper extends earlier analysis by considering the effects of noise and other environmental factors. The ..."
The degradation in flexible prediction accuracy that results from noise is mitigated by 'preferred' prediction points for individual attributes. Methods that identify these prediction points are inspired by pruning in learning from examples. We extend these noise-tolerant techniques to untutored
On the sample complexity of noise-tolerant learning
Information Processing Letters, 1996
"... In this paper, we further characterize the complexity of noise-tolerant learning in the PAC model. Specifically, we show a general lower bound of Ω(log(1/δ) / (ε(1 − 2η)²)) on the number of examples required for PAC learning in the presence of classification noise. Com ..."
Cited by 15 (2 self)
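Read literally, the extracted bound appears to be Ω(log(1/δ) / (ε(1 − 2η)²)), with ε the accuracy, δ the confidence, and η the classification-noise rate. Assuming that reading, a quick numeric check (illustrative only; the function name is mine and constant factors are omitted) shows how the required sample size blows up as η approaches 1/2:

```python
import math

def noisy_pac_sample_lower_bound(eps, delta, eta):
    """Evaluate log(1/delta) / (eps * (1 - 2*eta)^2), the quantity inside
    the Omega(...) lower bound; constant factors are dropped.
    eps: accuracy, delta: confidence, eta: classification-noise rate."""
    assert 0 < eta < 0.5, "the bound degenerates as eta approaches 1/2"
    return math.log(1 / delta) / (eps * (1 - 2 * eta) ** 2)

# With eps = 0.1 and delta = 0.05, mild noise is cheap but near-fair
# coin-flip noise is very expensive:
low_noise = noisy_pac_sample_lower_bound(0.1, 0.05, eta=0.1)
high_noise = noisy_pac_sample_lower_bound(0.1, 0.05, eta=0.45)
```

The (1 − 2η)² factor in the denominator is what drives the growth: at η = 0.45 it is 0.01, a 64× penalty over η = 0.1.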
Noise-tolerant learning, the parity problem, and the statistical query model
 J. ACM
"... We describe a slightly subexponential time algorithm for learning parity functions in the presence of random classification noise. This results in a polynomial-time algorithm for the case of parity functions that depend on only the first O(log n log log n) bits of input. This is the first known ins ..."
Cited by 164 (2 self)
instance of an efficient noise-tolerant algorithm for a concept class that is provably not learnable in the Statistical Query model of Kearns [7]. Thus, we demonstrate that the set of problems learnable in the statistical query model is a strict subset of those problems learnable in the presence of noise
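For contrast with the noisy setting the abstract describes, the noise-free parity problem is easy: each example (x, f(x)) is a linear equation over GF(2), so plain Gaussian elimination recovers the hidden parity set. The sketch below (my own illustration; `gf2_solve` and the toy instance are hypothetical) shows that easy case, and its inconsistency check marks exactly the step that random classification noise destroys.

```python
import random

def gf2_solve(rows, n):
    """Solve a linear system over GF(2).
    rows: list of (mask, rhs), where mask packs n coefficient bits.
    Returns one solution as a list of n bits, or None if inconsistent."""
    pivots = {}  # pivot column -> (reduced mask, rhs)
    for mask, rhs in rows:
        for col in range(n):
            if not (mask >> col) & 1:
                continue
            if col in pivots:                 # eliminate this column
                pmask, prhs = pivots[col]
                mask ^= pmask
                rhs ^= prhs
            else:                             # new pivot found
                pivots[col] = (mask, rhs)
                break
        else:
            if rhs:  # row reduced to 0 = 1: happens once labels are noisy
                return None
    x = [0] * n
    for col in sorted(pivots, reverse=True):  # back-substitution
        pmask, prhs = pivots[col]
        val = prhs
        for c in range(col + 1, n):
            if (pmask >> c) & 1:
                val ^= x[c]
        x[col] = val
    return x

# Noise-free demo: recover a hidden parity over n = 8 bits from random examples.
random.seed(0)
n, secret = 8, 0b00101001                     # parity over bits {0, 3, 5}
examples = []
for _ in range(40):
    xbits = random.getrandbits(n)
    label = bin(xbits & secret).count("1") & 1
    examples.append((xbits, label))
solution = gf2_solve(examples, n)             # consistent with every example
```

With noisy labels, the system becomes inconsistent and elimination fails outright, which is why sub-exponential noise-tolerant parity learning is the notable result here.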
A learning algorithm for Boltzmann machines
Cognitive Science, 1985
"... The computational power of massively parallel networks of simple processing elements resides in the communication bandwidth provided by the hardware connections between elements. These connections can allow a significant fraction of the knowledge of the system to be applied to an instance of a probl ..."
Cited by 586 (13 self)
to a general learning rule for modifying the connection strengths so as to incorporate knowledge about a task domain in an efficient way. We describe some simple examples in which the learning algorithm creates internal representations that are demonstrably the most efficient way of using
Semi-Supervised Learning Literature Survey
2006
"... We review the literature on semi-supervised learning, which is an area in machine learning and, more generally, artificial intelligence. There has been a whole spectrum of interesting ideas on how to learn from both labeled and unlabeled data, i.e. semi-supervised learning. This document is a chapter ..."
Cited by 757 (8 self)
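One of the simplest ideas in that spectrum is self-training: fit on the labeled data, pseudo-label the unlabeled points the model is confident about, and refit. A toy 1-D sketch (my own illustration, not from the survey; the function names and the margin-based confidence rule are made up):

```python
def fit_centroids(labeled):
    """Nearest-centroid classifier on 1-D points: labeled = [(x, y)], y in {0, 1}."""
    c0 = sum(x for x, y in labeled if y == 0) / sum(1 for _, y in labeled if y == 0)
    c1 = sum(x for x, y in labeled if y == 1) / sum(1 for _, y in labeled if y == 1)
    predict = lambda x: 0 if abs(x - c0) <= abs(x - c1) else 1
    return predict, c0, c1

def self_train_round(labeled, unlabeled, margin=2.0):
    """One self-training round: pseudo-label each unlabeled point whose
    distances to the two centroids differ by at least `margin`, then refit."""
    predict, c0, c1 = fit_centroids(labeled)
    augmented = list(labeled)
    for x in unlabeled:
        if abs(abs(x - c0) - abs(x - c1)) >= margin:   # confident points only
            augmented.append((x, predict(x)))
    return fit_centroids(augmented)

labeled = [(0.0, 0), (1.0, 0), (9.0, 1), (10.0, 1)]
unlabeled = [0.5, 1.5, 8.5, 9.5, 5.0]   # 5.0 is ambiguous and gets skipped
predict, c0, c1 = self_train_round(labeled, unlabeled)
```

After one round the centroids shift toward the unlabeled mass (0.75 and 9.25 here), which is the basic mechanism by which unlabeled data sharpens the decision boundary.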
On the impossibility of informationally efficient markets
AMERICAN ECONOMIC REVIEW, 1980
Active Learning with Statistical Models
1995
"... For many types of learners one can compute the statistically "optimal" way to select data. We review how these techniques have been used with feedforward neural networks [MacKay, 1992; Cohn, 1994]. We then show how the same principles may be used to select data for two alternative, statist ..."
Cited by 677 (12 self)
statistically-based learning architectures: mixtures of Gaussians and locally weighted regression. While the techniques for neural networks are expensive and approximate, the techniques for mixtures of Gaussians and locally weighted regression are both efficient and accurate.