
Results 1 - 10 of 658

Online Ensemble Learning: An Empirical Study

by Alan Fern, Robert Givan - In Proceedings of the Seventeenth International Conference on Machine Learning, 2000
"... We study resource-limited online learning, motivated by the problem of conditional-branch outcome prediction in computer architecture. In particular, we consider (parallel) time- and space-efficient ensemble learners for online settings, empirically demonstrating benefits similar to those shown ..."
Cited by 32 (1 self)

Learning quickly when irrelevant attributes abound: A new linear-threshold algorithm

by Nick Littlestone - Machine Learning, 1988
Keywords: learning Boolean functions, linear-threshold algorithms
"... Valiant (1984) and others have studied the problem of learning various classes of Boolean functions from examples. Here we discuss incremental learning of these functions. We consider a setting in which the learner responds to each example ... be expressed as a linear-threshold algorithm. A primary advantage of this algorithm is that the number of mistakes grows only logarithmically with the number of irrelevant attributes in the examples. At the same time, the algorithm is computationally efficient in both time and space. ..."
Cited by 773 (5 self)
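The logarithmic mistake bound described above comes from Winnow's multiplicative weight updates. A minimal sketch, assuming a monotone-disjunction target; the promotion factor and threshold below are standard illustrative choices, not values quoted from the paper:

```python
# Sketch of Winnow for Boolean attributes: promote the weights of active
# attributes on a false negative, demote them on a false positive.
# Irrelevant attributes only ever get demoted, so mistakes scale
# logarithmically with their number.

def winnow_predict(weights, x, threshold):
    """Predict 1 if the summed weight of active attributes clears the threshold."""
    return 1 if sum(w for w, xi in zip(weights, x) if xi) >= threshold else 0

def winnow_update(weights, x, y, y_hat, alpha=2.0):
    """Multiplicative update: only weights of active attributes change."""
    if y_hat == y:
        return weights
    factor = alpha if y == 1 else 1.0 / alpha  # promote vs. demote
    return [w * factor if xi else w for w, xi in zip(weights, x)]

def train_winnow(examples, n, alpha=2.0):
    """Run one online pass; return final weights and the mistake count."""
    threshold = n / 2
    weights = [1.0] * n
    mistakes = 0
    for x, y in examples:
        y_hat = winnow_predict(weights, x, threshold)
        if y_hat != y:
            mistakes += 1
            weights = winnow_update(weights, x, y, y_hat, alpha)
    return weights, mistakes
```

For example, with the target "x0 is on" over four attributes, the learner converges after one promotion and one demotion.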

When Networks Disagree: Ensemble Methods for Hybrid Neural Networks

by Michael P. Perrone, Leon N. Cooper, 1993
"... This paper presents a general theoretical framework for ensemble methods of constructing significantly improved regression estimates. Given a population of regression estimators, we construct a hybrid estimator which is as good as or better, in the MSE sense, than any estimator in the population. We argue that the ensemble method presented has several properties: 1) It efficiently uses all the networks of a population - none of the networks need be discarded. 2) It efficiently uses all the available data for training without over-fitting. 3) It inherently performs regularization by smoothing ..."
Cited by 349 (3 self)
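The "as good as or better in the MSE sense" property is easiest to see with plain averaging of a population of regression estimators. A minimal sketch, with toy linear functions standing in for trained networks; Perrone and Cooper's framework also covers weighted combinations, which this sketch omits:

```python
# Sketch of basic ensemble averaging for regression: the ensemble's
# prediction is the mean of all estimators' outputs. When the
# estimators' errors partially cancel, the averaged predictor's MSE
# is no worse than the population average, and often much better.

def ensemble_predict(estimators, x):
    """Average the predictions of every estimator in the population."""
    preds = [f(x) for f in estimators]
    return sum(preds) / len(preds)

def mse(predict, data):
    """Mean squared error of a predictor over (x, y) pairs."""
    return sum((predict(x) - y) ** 2 for x, y in data) / len(data)
```

With four estimators whose biases cancel around the true function 2x, the ensemble beats every individual member:

```python
estimators = [lambda x: 2*x + 1, lambda x: 2*x - 1,
              lambda x: 2*x + 0.5, lambda x: 2*x - 0.5]
data = [(x, 2 * x) for x in range(4)]
ensemble_mse = mse(lambda x: ensemble_predict(estimators, x), data)
# Here the individual biases cancel exactly, so the ensemble MSE is 0,
# below the best individual MSE of 0.25.
```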

Ensemble forecasting at NCEP and the breeding method

by Zoltan Toth, Eugenia Kalnay - Mon. Wea. Rev., 1997
"... The breeding method has been used to generate perturbations for ensemble forecasting at the National Centers for Environmental Prediction (formerly known as the National Meteorological Center) since December 1992. At that time a single breeding cycle with a pair of bred forecasts was implemented. In March 1994, the ensemble was expanded to seven independent breeding cycles on the Cray C90 supercomputer, and the forecasts were extended to 16 days. This provides 17 independent global forecasts valid for two weeks every day. For efficient ensemble forecasting, the initial perturbations to the control ..."
Cited by 196 (15 self)

Ensembles of multi-instance learners

by Zhi-hua Zhou, Min-ling Zhang - In Proc of the 14th European Conf on Machine Learning, 2003
"... In multi-instance learning, the training set comprises labeled bags that are composed of unlabeled instances, and the task is to predict the labels of unseen bags. Through analyzing two famous multi-instance learning algorithms, this paper shows that many supervised learning algorithms can be adapted to multi-instance learning, as long as their focus is shifted from discrimination on the instances to discrimination on the bags. Moreover, considering that ensemble learning paradigms can effectively enhance supervised learners, this paper proposes to build ensembles ..."
Cited by 28 (8 self)

A simple, fast, and effective rule learner

by William W. Cohen, Yoram Singer - In Proceedings of the Annual Conference of the American Association for Artificial Intelligence, 1999
"... We describe SLIPPER, a new rule learner that generates rulesets by repeatedly boosting a simple, greedy rule-builder. Like the rulesets built by other rule learners, the ensemble of rules created by SLIPPER is compact and comprehensible. This is made possible by imposing appropriate constraints on ..."
Cited by 119 (3 self)

An Ensemble Technique for Stable Learners with Performance Bounds

by Ian Davidson, 2004
"... Ensemble techniques such as bagging and DECORATE exploit the "instability" of learners, such as decision trees, to create a diverse set of models. However, their application to stable learners such as naïve Bayes does not yield as much improvement and can sometimes degrade performance ..."
Cited by 5 (1 self)
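For reference, the bagging scheme the abstract contrasts against trains one base learner per bootstrap resample of the data and combines them by majority vote. A minimal sketch; the decision-stump base learner and the ensemble size are illustrative choices, not from the paper:

```python
import random

# Sketch of bagging (bootstrap aggregating). Unstable learners such as
# decision trees produce very different models on resampled data, which
# is what gives the vote its diversity; a stable learner would produce
# nearly identical models, so the vote would add little.

def bootstrap(data, rng):
    """Resample the dataset with replacement, same size as the original."""
    return [rng.choice(data) for _ in data]

def train_stump(data):
    """Fit a 1-D decision stump: pick the threshold t minimizing training
    error for the rule (x >= t -> class 1). Candidates come from the data."""
    best_t, best_err = 0.0, float("inf")
    for t, _ in data:
        err = sum((1 if x >= t else 0) != y for x, y in data)
        if err < best_err:
            best_t, best_err = t, err
    return best_t

def bagged_predict(thresholds, x):
    """Majority vote over the stumps' predictions."""
    votes = sum(1 if x >= t else 0 for t in thresholds)
    return 1 if votes * 2 > len(thresholds) else 0

def bag(data, n_models=11, seed=0):
    """Train one stump per bootstrap resample."""
    rng = random.Random(seed)
    return [train_stump(bootstrap(data, rng)) for _ in range(n_models)]
```

On a cleanly separated toy dataset the vote recovers the underlying split even though each stump saw a different resample.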

Dynamic Weighted Majority: A New Ensemble Method for Tracking Concept Drift

by Jeremy Z. Kolter, Marcus A. Maloof, 2003
"... Algorithms for tracking concept drift are important for many applications. We present a general method based on the Weighted Majority algorithm for using any on-line learner for concept drift. Dynamic Weighted Majority (DWM) maintains an ensemble of base learners, predicts using a weighted-majority ..."
Cited by 91 (0 self)
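The weighted-vote core of the DWM idea can be sketched as below. The paper's full algorithm also adds and removes experts online as the concept drifts, which this sketch omits, and the demotion factor beta = 0.5 is an illustrative choice:

```python
# Sketch of the Weighted Majority mechanism underlying DWM: keep a pool
# of experts, predict by a weighted vote, and multiplicatively demote
# each expert that errs. Experts that track the current concept keep
# their weight and come to dominate the vote.

def weighted_vote(experts, weights, x):
    """Return the class with the most total expert weight behind it."""
    totals = {}
    for expert, w in zip(experts, weights):
        pred = expert(x)
        totals[pred] = totals.get(pred, 0.0) + w
    return max(totals, key=totals.get)

def dwm_step(experts, weights, x, y, beta=0.5):
    """One online round: predict on x, then demote every expert that
    mispredicted the true label y."""
    y_hat = weighted_vote(experts, weights, x)
    new_weights = [w * beta if e(x) != y else w
                   for e, w in zip(experts, weights)]
    return y_hat, new_weights
```

With three toy experts (always-0, always-1, and parity) on a parity-labeled stream, the two constant experts are demoted while the parity expert keeps full weight.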

Ensembles of Learning Machines

by Giorgio Valentini, Francesco Masulli - Neural Nets WIRN Vietri-02, Lecture Notes in Computer Science, 2002
"... Ensembles of learning machines constitute one of the main current directions in machine learning research, and have been applied to a wide range of real problems. Despite the absence of a unified theory of ensembles, there are many theoretical reasons for combining multiple learners, and an empirical ..."
Cited by 57 (4 self)

Ensembling local learners through multimodal perturbation

by Zhi-hua Zhou, Yang Yu - IEEE Transactions on Systems, Man, and Cybernetics, Part B: Cybernetics
"... Ensemble learning algorithms train multiple component learners and then combine their predictions. In order to generate a strong ensemble, the component learners should have high accuracy as well as high diversity. A popularly used scheme for generating accurate but diverse component ..."
Cited by 15 (3 self)

Developed at and hosted by The College of Information Sciences and Technology

© 2007-2019 The Pennsylvania State University