Results 1 – 10 of 628
Mixing weak learners in semantic parsing
in Proceedings of EMNLP 2004
, 2004
Abstract

Cited by 11 (0 self)
We apply a novel variant of Random Forests (Breiman, 2001) to the shallow semantic parsing problem and show extremely promising results. The final system has a semantic role classification accuracy of 88.3% using PropBank gold-standard parses. These results are better than all others published except those of the Support Vector Machine (SVM) approach implemented by Pradhan et al. (2003), and Random Forests have numerous advantages over SVMs, including simplicity, faster training and classification, easier multiclass classification, and easier problem-specific customization. We also present new features that result in a 1.1% gain in classification accuracy and describe a technique that results in a 97% reduction in the feature space with no significant degradation in accuracy.
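The Random Forest idea this abstract builds on (Breiman, 2001) combines bootstrap resampling, random feature selection per tree, and majority voting. A minimal sketch follows; for brevity each "tree" is a one-feature threshold rule rather than a full decision tree, and the toy dataset and all names are illustrative assumptions, not the paper's system.

```python
# Minimal Random Forest sketch: bootstrap bags + random feature subsets
# + majority vote. Each "tree" here is a single threshold stump for
# brevity; a real forest grows full decision trees.
import random

def fit_stump(X, y, features):
    """Pick the best (feature, threshold, sign) rule over the given features."""
    best, best_err = None, float("inf")
    for f in features:
        for t in sorted({row[f] for row in X}):
            for sign in (1, -1):
                err = sum((sign if row[f] > t else -sign) != yi
                          for row, yi in zip(X, y))
                if err < best_err:
                    best_err, best = err, (f, t, sign)
    return best

def predict_stump(stump, row):
    f, t, sign = stump
    return sign if row[f] > t else -sign

def fit_forest(X, y, n_trees=25, seed=0):
    rng = random.Random(seed)
    n, d = len(X), len(X[0])
    forest = []
    for _ in range(n_trees):
        idx = [rng.randrange(n) for _ in range(n)]       # bootstrap bag
        feats = rng.sample(range(d), max(1, d // 2))     # random subspace
        forest.append(fit_stump([X[i] for i in idx],
                                [y[i] for i in idx], feats))
    return forest

def predict_forest(forest, row):
    vote = sum(predict_stump(s, row) for s in forest)    # majority vote
    return 1 if vote >= 0 else -1

# Toy data: both features carry the same threshold concept.
X = [[i / 10, i / 10 + 0.05] for i in range(10)]
y = [1 if row[0] > 0.45 else -1 for row in X]
forest = fit_forest(X, y)
accuracy = sum(predict_forest(forest, r) == t for r, t in zip(X, y)) / len(X)
```

The advantages the abstract lists (fast training, easy multiclass voting) fall out of this structure: trees train independently and the vote generalizes directly to more classes.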
Weak Learners and Improved Rates of Convergence in Boosting
 In Advances in Neural Information Processing Systems
, 2000
Abstract

Cited by 3 (1 self)
The problem of constructing weak classifiers for boosting algorithms is studied. We present an algorithm that produces a linear classifier that is guaranteed to achieve an error better than random guessing for any distribution on the data. While this weak learner is not useful for learning in general ...
On the Existence of Linear Weak Learners and Applications to Boosting
, 2002
Abstract
We consider the existence of a linear weak learner for boosting algorithms. A weak learner for binary classification problems is required to achieve a weighted empirical error on the training set that is bounded from above by 1/2 − γ, γ > 0, for any distribution on the data set. Moreover, in order ...
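The weak-learning requirement stated above has an easy non-strict version worth seeing concretely: for any fixed classifier h and any weighting of the data, either h or its negation has weighted error at most 1/2 (if h errs with weight ε > 1/2, then −h errs with weight 1 − ε < 1/2). The papers above address the harder question of a strict 1/2 − γ bound; the sketch below, with an arbitrary illustrative dataset, shows only the easy half.

```python
# Sketch of the basic weak-learning observation: for any classifier h
# and any distribution w over the data, h or its negation -h has
# weighted error <= 1/2. (The strict 1/2 - gamma guarantee discussed in
# the abstracts is a stronger, nontrivial result.)

def weighted_error(h, X, y, w):
    """Weighted empirical error of classifier h under distribution w."""
    return sum(wi for xi, yi, wi in zip(X, y, w) if h(xi) != yi)

def weak_learner(h, X, y, w):
    """Return h or its negation, whichever has weighted error <= 1/2."""
    if weighted_error(h, X, y, w) <= 0.5:
        return h
    return lambda x: -h(x)

X = [0.1, 0.4, 0.6, 0.9]
y = [1, 1, -1, -1]              # h below gets every label wrong
w = [0.25, 0.25, 0.25, 0.25]    # uniform distribution
h = lambda x: 1 if x > 0.5 else -1
g = weak_learner(h, X, y, w)
err = weighted_error(g, X, y, w)
```

Here h has weighted error 1.0, so its negation is returned and achieves error 0.0. Note the argument breaks when the error is exactly 1/2, which is why strict bounds require an actual construction.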
A Geometric Approach to Leveraging Weak Learners
Computational Learning Theory: 4th European Conference (EuroCOLT '99)
, 1998
Abstract

Cited by 27 (4 self)
AdaBoost is a popular and effective leveraging procedure for improving the hypotheses generated by weak learning algorithms. AdaBoost and many other leveraging algorithms can be viewed as performing a constrained gradient descent over a potential function. At each iteration the distribution over the sample given to the weak learner is the direction of steepest descent. We introduce a new leveraging algorithm based on a natural potential function. For this potential function, the direction of steepest descent can have negative components. Therefore we provide two transformations for obtaining ...
Totally Corrective Multiclass Boosting with Binary Weak Learners
, 2010
Abstract
In this work, we propose a new optimization framework for multiclass boosting. In the literature, AdaBoost.MO and AdaBoost.ECC are the two successful multiclass boosting algorithms which can use binary weak learners. We explicitly derive these two algorithms' Lagrange dual problems ...
The strength of weak learnability
Machine Learning
, 1990
Abstract

Cited by 871 (26 self)
This paper addresses the problem of improving the accuracy of an hypothesis output by a learning algorithm in the distribution-free (PAC) learning model. A concept class is learnable (or strongly learnable) if, given access to a source of examples of the unknown concept, the learner with high probability is able to output an hypothesis that is correct on all but an arbitrarily small fraction of the instances. The concept class is weakly learnable if the learner can produce an hypothesis that performs only slightly better than random guessing. In this paper, it is shown that these two notions ...
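The weak-to-strong equivalence this paper establishes was later made practical by boosting algorithms such as AdaBoost, which reweights the data so each new weak hypothesis concentrates on previous mistakes. The following is a minimal sketch (not the paper's construction): a hand-rolled AdaBoost over threshold stumps on a labeling that no single stump can fit, with all data and helper names illustrative assumptions.

```python
# AdaBoost sketch illustrating weak-to-strong conversion: weak threshold
# stumps, each only slightly better than chance, combine into a
# weighted-majority vote with zero training error.
import math

def best_stump(X, y, w):
    """Weighted-error-minimizing threshold rule on 1-D inputs."""
    best, best_err = None, float("inf")
    for t in X:
        for sign in (1, -1):
            err = sum(wi for xi, yi, wi in zip(X, y, w)
                      if (sign if xi > t else -sign) != yi)
            if err < best_err:
                best_err, best = err, (t, sign)
    return best, best_err

def adaboost(X, y, rounds=30):
    n = len(X)
    w = [1.0 / n] * n
    ensemble = []
    for _ in range(rounds):
        (t, sign), err = best_stump(X, y, w)
        err = max(err, 1e-12)
        if err >= 0.5:                      # no weak hypothesis; stop
            break
        alpha = 0.5 * math.log((1 - err) / err)
        ensemble.append((alpha, t, sign))
        preds = [sign if xi > t else -sign for xi in X]
        # Upweight mistakes, downweight correct points, renormalize.
        w = [wi * math.exp(-alpha * yi * pi)
             for wi, yi, pi in zip(w, y, preds)]
        s = sum(w)
        w = [wi / s for wi in w]
    return ensemble

def predict(ensemble, x):
    score = sum(a * (s if x > t else -s) for a, t, s in ensemble)
    return 1 if score >= 0 else -1

# Alternating labels: no single stump is perfect, the ensemble is.
X = [0.1, 0.3, 0.5, 0.7, 0.9]
y = [1, -1, 1, -1, 1]
model = adaboost(X, y)
train_acc = sum(predict(model, xi) == yi for xi, yi in zip(X, y)) / len(X)
```

Because a weighted combination of these stumps separates this labeling with positive margin, every reweighting admits a stump with edge over 1/2, and the standard training-error bound drives the vote to perfect training accuracy.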
Bagging Very Weak Learners with Lazy Local Learning
Abstract
Bagging predictors have been shown to be effective especially when the learners used to train the base classifiers are weak. In this paper, we argue that for very weak (VW) learners, such as DecisionStump, OneR, and SuperPipes, the base classifiers built from bootstrap bags are strongly correlated ...
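The correlation claim above can be checked directly: bag a very weak learner and measure how often pairs of base classifiers agree. The sketch below (a hypothetical setup, not the paper's experiment) bags decision stumps on a clean threshold concept; because every bag supports nearly the same stump, pairwise agreement comes out high and bagging adds little diversity.

```python
# Measure pairwise agreement between bagged very-weak base classifiers
# (decision stumps). High agreement = strong correlation = little
# diversity for bagging to exploit.
import random
from itertools import combinations

def fit_stump(X, y):
    """Best threshold rule on 1-D data (the 'very weak' base learner)."""
    best, best_err = None, float("inf")
    for t in X:
        for sign in (1, -1):
            err = sum((sign if xi > t else -sign) != yi
                      for xi, yi in zip(X, y))
            if err < best_err:
                best_err, best = err, (t, sign)
    return best

def predict(stump, x):
    t, sign = stump
    return sign if x > t else -sign

rng = random.Random(1)
X = [i / 20 for i in range(20)]
y = [1 if xi > 0.5 else -1 for xi in X]      # clean threshold concept

stumps = []
for _ in range(10):                           # 10 bootstrap bags
    idx = [rng.randrange(len(X)) for _ in range(len(X))]
    stumps.append(fit_stump([X[i] for i in idx], [y[i] for i in idx]))

# Mean pairwise agreement between the bagged base classifiers.
pairs = list(combinations(stumps, 2))
agreement = sum(
    sum(predict(a, xi) == predict(b, xi) for xi in X) / len(X)
    for a, b in pairs
) / len(pairs)
```

Bags differ only in which boundary points they happen to contain, so the fitted thresholds, and hence the predictions, nearly coincide.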
Fast Weak Learner Based on Genetic Algorithm
Abstract
An approach to the acceleration of parametric weak classifier boosting is proposed. A weak classifier is called parametric if it has a fixed number of parameters and can therefore be represented as a point in a multidimensional space. A genetic algorithm is used to learn the parameters of such a classifier ...
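The idea above can be sketched under assumed details: treat a parametric weak classifier as a point in parameter space (here just a threshold t and a sign s for a 1-D rule) and search that space with a simple genetic algorithm instead of exhaustive enumeration. Population size, selection scheme, mutation rate, and fitness below are illustrative choices, not the paper's.

```python
# GA-based weak learner sketch: evolve (threshold, sign) parameters to
# maximize weighted accuracy, the quantity boosting asks of a weak learner.
import random

def classify(params, x):
    t, s = params
    return s if x > t else -s

def fitness(params, X, y, w):
    """Weighted accuracy under distribution w."""
    return sum(wi for xi, yi, wi in zip(X, y, w) if classify(params, xi) == yi)

def ga_weak_learner(X, y, w, pop_size=20, generations=30, seed=2):
    rng = random.Random(seed)
    pop = [(rng.uniform(min(X), max(X)), rng.choice((1, -1)))
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=lambda p: fitness(p, X, y, w), reverse=True)
        elite = pop[: pop_size // 2]          # truncation selection
        children = []
        for _ in range(pop_size - len(elite)):
            t, s = rng.choice(elite)          # mutate a surviving parent
            children.append((t + rng.gauss(0, 0.1),
                             s if rng.random() < 0.9 else -s))
        pop = elite + children
    return max(pop, key=lambda p: fitness(p, X, y, w))

X = [i / 10 for i in range(10)]
y = [1 if xi > 0.45 else -1 for xi in X]
w = [0.1] * 10                                # uniform weighting
best = ga_weak_learner(X, y, w)
acc = fitness(best, X, y, w)
```

The payoff is speed: each generation evaluates a fixed-size population instead of sweeping every candidate parameter setting, which matters when the parameter space is large.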
Combining weak learners with probabilistic models for binary classification
Abstract
This project addresses the problem of binary classification, i.e. we want to find a model F : X → {0, 1} for the dependence of a class c on x from a set of samples D = {(c_t, x_t)}_{t=1}^{N} with c_t ∈ {0, 1}. The approach is to combine multiple base learners with the use of probabilistic models. In particular ...
Mining Direct Marketing Data by Ensembles of Weak Learners and Rough Set Methods
Abstract
This paper describes a prediction problem based on direct marketing data coming from the Nationwide Products and Services Questionnaire (NPSQ) prepared by the Polish division of Acxiom Corporation. The problem that we analyze is stated as prediction of accessibility to the Internet. The unit of the analysis corresponds to a group of individuals in a certain age category living in a certain building located in Poland. We used several machine learning methods to build our prediction models. In particular, we applied ensembles of weak learners and the ModLEM algorithm, which is based on the rough set approach ...