Results 1–10 of 206
Reducing Multiclass to Binary: A Unifying Approach for Margin Classifiers
Journal of Machine Learning Research, 2000
"... We present a unifying framework for studying the solution of multiclass categorization problems by reducing them to multiple binary problems that are then solved using a margin-based binary learning algorithm. The proposed framework unifies some of the most popular approaches in which each class ..."
Cited by 561 (20 self)
generalization error analysis for general output codes with AdaBoost as the binary learner. Experimental results with SVM and AdaBoost show that our scheme provides a viable alternative to the most commonly used multiclass algorithms.
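The reduction this abstract describes can be illustrated with the simplest member of the framework, one-vs-rest decoding. The sketch below is hypothetical: the class name `OneVsRest` and the use of a plain perceptron as the margin-based binary learner are stand-ins, not the paper's implementation (which uses SVM and AdaBoost).

```python
import numpy as np

class OneVsRest:
    """One-vs-rest reduction: train one binary margin classifier per
    class, decode by taking the class with the largest signed margin.
    (A minimal sketch; the paper's framework also covers all-pairs
    and general output codes.)"""

    def __init__(self, n_classes, n_features):
        self.W = np.zeros((n_classes, n_features))  # one weight vector per class

    def fit(self, X, y, epochs=20):
        # Class k's binary problem: label +1 for class k, -1 otherwise.
        # A plain perceptron stands in for the margin-based binary learner.
        for k in range(self.W.shape[0]):
            yk = np.where(y == k, 1.0, -1.0)
            w = np.zeros(X.shape[1])
            for _ in range(epochs):
                for xi, yi in zip(X, yk):
                    if yi * (w @ xi) <= 0:      # margin violated
                        w += yi * xi
            self.W[k] = w
        return self

    def predict(self, X):
        # Decode: the class whose binary learner reports the largest margin.
        return np.argmax(X @ self.W.T, axis=1)
```

Decoding by largest margin (rather than by hard votes) is what makes the margin of the binary learner carry over to a generalization analysis of the combined multiclass classifier.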
Boosting with Online Binary Learners for the Multiclass Bandit Problem
"... We consider the problem of online multiclass prediction in the bandit setting. Compared with the full-information setting, in which the learner can receive the true label as feedback after making each prediction, the bandit setting assumes that the learner can only know the correctness of the predi ..."
Cited by 1 (1 self)
of the predicted label. Because the bandit setting is more restricted, it is difficult to design good bandit learners and currently there are not many bandit learners. In this paper, we propose an approach that systematically converts existing online binary classifiers to promising bandit learners with strong
Erratum: Constructing Multiclass Learners from Binary Learners: A Simple Black-Box Analysis of the Generalization Errors
"... Abstract. There are errors in our paper “Constructing Multiclass Learners from Binary Learners: A ..."
Improving generalization with active learning
Machine Learning, 1994
"... Abstract. Active learning differs from "learning from examples" in that the learning algorithm assumes at least some control over what part of the input domain it receives information about. In some situations, active learning is provably more powerful than learning from examples ..."
Cited by 544 (1 self)
alone, giving better generalization for a fixed number of training examples. In this article, we consider the problem of learning a binary concept in the absence of noise. We describe a formalism for active concept learning called selective sampling and show how it may be approximately implemented by a
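The selective-sampling idea can be illustrated with a minimal uncertainty-style query rule. This is a hypothetical sketch: the function name and the smallest-margin criterion are ours; the paper's formalism queries points inside a region of uncertainty, which a margin rule like this only approximates.

```python
import numpy as np

def select_query(w, pool):
    """Selective sampling, sketched: rather than labeling every example,
    query the unlabeled point the current linear hypothesis w is least
    certain about, i.e. the one with the smallest margin |w . x|."""
    margins = np.abs(pool @ w)
    return int(np.argmin(margins))   # index of the point to label next
```

Concentrating labels near the current decision boundary is what yields better generalization than passive learning for a fixed number of labeled examples.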
Enhancements of Multiclass Support Vector Machine Construction from Binary Learners using Generalization Performance
"... We propose several novel methods for enhancing the multiclass SVMs by applying the generalization performance of binary classifiers as the core idea. This concept will be applied on the existing algorithms, i.e., the Decision Directed Acyclic Graph (DDAG), the Adaptive Directed Acyclic Graphs (ADAG ..."
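The DDAG evaluation rule that these enhancements build on can be sketched as follows. The code is hypothetical: `decide` stands in for the trained pairwise binary SVMs, and the elimination order shown is the standard DDAG one, not the paper's adaptive variants.

```python
def ddag_predict(classes, decide):
    """Evaluate a Decision Directed Acyclic Graph (DDAG): start with all
    classes as candidates; at each node, run the pairwise binary
    classifier for the first and last remaining candidates and eliminate
    the loser. Only k-1 binary evaluations are needed for k classes.
    decide(a, b) returns the winning class of the a-vs-b classifier."""
    candidates = list(classes)
    while len(candidates) > 1:
        a, b = candidates[0], candidates[-1]
        if decide(a, b) == a:
            candidates.pop()       # b loses and is eliminated
        else:
            candidates.pop(0)      # a loses and is eliminated
    return candidates[0]
```

The abstract's point is that which pair gets compared at each node is a free design choice, so ordering comparisons by the binary classifiers' estimated generalization performance can reduce the chance of eliminating the true class early.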
Combining weak learners with probabilistic models for binary classification
"... In this project we address the problem of binary classification, i.e., we want to find a model F: X → {0, 1} for the dependence of a class c on x from a set of samples D = {(c_t, x_t)}_{t=1}^N with c_t ∈ {0, 1}. The approach is to combine multiple base learners with the use of probabilistic models. In par ..."
Totally Corrective Multiclass Boosting with Binary Weak Learners, 2010
"... Abstract—In this work, we propose a new optimization framework for multiclass boosting learning. In the literature, AdaBoost.MO and AdaBoost.ECC are the two successful multiclass boosting algorithms, which can use binary weak learners. We explicitly derive these two algorithms' Lagrange dual prob ..."
Combined Binary Classifiers With Applications To Speech Recognition
NEAREST-NEIGHBOR ECOC WITH APPLICATION TO ALL-PAIRS MULTICLASS SVM, 2002
"... Many applications require classification of examples into one of several classes. A common way of designing such classifiers is to determine the class based on the outputs of several binary classifiers. We consider some of the most popular methods for combining the decisions of the binary classifier ..."
Cited by 8 (2 self)
classifiers, and improve existing bounds on the error rates of the combined classifier over the training set. We also describe a new method for combining binary classifiers. The method is based on stacking a neural network and, when used with support vector machines as the binary learners, substantially
Boosting with decision stumps and binary features, 2003
"... A special case of boosting is when features are binary and the base learner is ..."
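The special case the abstract refers to can be made concrete with a small sketch: AdaBoost whose weak learner is a decision stump over 0/1-valued features. The code is hypothetical, not the paper's; with binary features, each stump simply votes with one feature, possibly sign-flipped.

```python
import numpy as np

def best_stump(X, y, w):
    """Weak learner: a stump on a binary feature, h(x) = s * (2*x[j] - 1).
    Return the (error, feature, sign) triple with lowest weighted error;
    X is 0/1 valued, y and s are +/-1."""
    best = None
    for j in range(X.shape[1]):
        vote = 2 * X[:, j] - 1                   # 0/1 feature as a +/-1 vote
        for s in (1, -1):
            err = w[(s * vote) != y].sum()
            if best is None or err < best[0]:
                best = (err, j, s)
    return best

def adaboost_stumps(X, y, rounds=10):
    """AdaBoost over binary-feature stumps; y is +/-1, X is 0/1."""
    w = np.full(len(y), 1.0 / len(y))
    ensemble = []                                # list of (alpha, feature, sign)
    for _ in range(rounds):
        err, j, s = best_stump(X, y, w)
        err = np.clip(err, 1e-10, 1 - 1e-10)     # guard the log against 0 error
        alpha = 0.5 * np.log((1 - err) / err)
        vote = s * (2 * X[:, j] - 1)
        w = w * np.exp(-alpha * y * vote)        # upweight the mistakes
        w /= w.sum()
        ensemble.append((alpha, j, s))
    return ensemble

def predict(ensemble, X):
    score = sum(a * s * (2 * X[:, j] - 1) for a, j, s in ensemble)
    return np.where(score >= 0, 1, -1)
```

Because a stump on a binary feature has only 2d distinct hypotheses for d features, the weak-learner search is an exact enumeration rather than a threshold search, which is what makes this special case tractable to analyze.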