Results 1 – 10 of 448,292
One-class SVM formulations for Multiple Instance Learning
Abstract
Multiple Instance Learning (MIL) considers a particular form of weak supervision in which the learner is given a set of positive bags and negative bags. Positive bags are sets of instances containing at least one positive example, and negative bags are sets of instances all of which are negative. A nu ...
One-Class SVM in Multi-Task Learning
Abstract
Multi-Task Learning (MTL) has become an active research topic in recent years. While most machine learning methods focus on learning tasks independently, multi-task learning aims to improve generalization performance by training multiple related tasks simultaneously. This paper presents a new approach to multi-task learning based on the one-class Support Vector Machine (one-class SVM). In the proposed approach, we first make the assumption that the model parameter values of different tasks are close to a certain mean value. Then, a number of one-class SVMs, one for each task ...
A Comparison of Methods for Multiclass Support Vector Machines
IEEE Trans. Neural Networks, 2002
Cited by 952 (22 self)
Abstract
Support vector machines (SVMs) were originally designed for binary classification. How to effectively extend them for multiclass classification is still an ongoing research issue. Several methods have been proposed in which a multiclass classifier is typically constructed by combining several binary classifiers. Some authors have also proposed methods that consider all classes at once. As it is computationally more expensive to solve multiclass problems, comparisons of these methods using large-scale problems have not been seriously conducted. Especially for methods solving multiclass SVM in one step, a much ...
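The "combining several binary classifiers" strategy this abstract refers to can be illustrated with a generic one-vs-rest sketch. The numpy example below is not code from the cited paper: it substitutes ridge-regression scorers for the binary SVMs, and every name, dataset, and parameter value here is invented purely to show the decomposition.

```python
import numpy as np

def ovr_fit(X, y, n_classes, lam=1e-3):
    # One-vs-rest decomposition: one binary scorer per class, trained
    # with targets +1 for "this class" and -1 for "the rest".
    Xb = np.hstack([X, np.ones((len(X), 1))])          # append a bias column
    W = []
    for c in range(n_classes):
        t = np.where(y == c, 1.0, -1.0)
        # Ridge-regression scorer (an illustrative stand-in for a binary SVM).
        w = np.linalg.solve(Xb.T @ Xb + lam * np.eye(Xb.shape[1]), Xb.T @ t)
        W.append(w)
    return np.array(W)

def ovr_predict(W, X):
    Xb = np.hstack([X, np.ones((len(X), 1))])
    return (Xb @ W.T).argmax(axis=1)                   # highest binary score wins

# Three well-separated 2-D clusters, one per class.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(c, 0.2, (20, 2)) for c in [(0, 0), (4, 0), (0, 4)]])
y = np.repeat([0, 1, 2], 20)
W = ovr_fit(X, y, 3)
acc = (ovr_predict(W, X) == y).mean()
```

The "all classes at once" alternatives the abstract mentions instead solve a single joint optimization rather than `n_classes` independent binary problems.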
Training Linear SVMs in Linear Time
2006
Cited by 549 (6 self)
Abstract
Linear Support Vector Machines (SVMs) have become one of the most prominent machine learning techniques for high-dimensional sparse data commonly encountered in applications like text classification, word-sense disambiguation, and drug design. These applications involve a large number of examples n ... is based on an alternative, but equivalent, formulation of the SVM optimization problem. Empirically, the Cutting-Plane Algorithm is several orders of magnitude faster than decomposition methods like SVM-Light for large datasets.
Large margin methods for structured and interdependent output variables
Journal of Machine Learning Research, 2005
Cited by 624 (12 self)
Abstract
Learning general functional dependencies between arbitrary input and output spaces is one of the key challenges in computational intelligence. While recent progress in machine learning has mainly focused on designing flexible and powerful input representations, this paper addresses the complementary ...
Quantum complexity theory
In Proc. 25th Annual ACM Symposium on Theory of Computing, ACM, 1993
Cited by 574 (5 self)
Abstract
In this paper we study quantum computation from a complexity-theoretic viewpoint. Our first result is the existence of an efficient universal quantum Turing machine in Deutsch's model of a quantum Turing machine (QTM) [Proc. Roy. Soc. London Ser. A, 400 (1985), pp. 97–117]. This constructi ... the modern (complexity-theoretic) formulation of the Church–Turing thesis. We show the existence of a problem, relative to an oracle, that can be solved in polynomial time on a quantum Turing machine, but requires superpolynomial time on a bounded-error probabilistic Turing machine, and thus is not in the class ...
Benchmarking Least Squares Support Vector Machine Classifiers
Neural Processing Letters, 2001
Cited by 476 (46 self)
Abstract
In Support Vector Machines (SVMs), the solution of the classification problem is characterized by a (convex) quadratic programming (QP) problem. In a modified version of SVMs, called Least Squares SVM classifiers (LS-SVMs), a least squares cost function is proposed so as to obtain a linear set of equations in the dual space. While the SVM classifier has a large margin interpretation, the LS-SVM formulation is related in this paper to a ridge regression approach for classification with binary targets and to Fisher's linear discriminant analysis in the feature space. Multiclass categorization ...
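The "linear set of equations in the dual space" can be made concrete with a minimal numpy sketch of the standard LS-SVM training system. This is a generic illustration under simplifying assumptions (linear kernel, tiny hand-made dataset, invented parameter values), not code from the implementations benchmarked in the paper.

```python
import numpy as np

def lssvm_fit(X, y, gamma=10.0):
    # LS-SVM classifier: equality-constrained least squares turns training
    # into a single linear system in the dual variables alpha and bias b:
    #   [ 0   y^T        ] [b    ]   [0]
    #   [ y   Omega+I/g  ] [alpha] = [1]
    # with Omega_ij = y_i y_j K(x_i, x_j).
    n = len(X)
    K = X @ X.T                                  # linear kernel for this sketch
    Omega = (y[:, None] * y[None, :]) * K
    A = np.zeros((n + 1, n + 1))
    A[0, 1:] = y
    A[1:, 0] = y
    A[1:, 1:] = Omega + np.eye(n) / gamma
    rhs = np.concatenate(([0.0], np.ones(n)))
    sol = np.linalg.solve(A, rhs)
    return sol[1:], sol[0]                       # alpha, b

def lssvm_predict(X_train, y_train, alpha, b, X_new):
    return np.sign(X_new @ X_train.T @ (alpha * y_train) + b)

# A tiny linearly separable problem.
X = np.array([[2.0, 0.0], [3.0, 1.0], [-2.0, 0.0], [-3.0, -1.0]])
y = np.array([1.0, 1.0, -1.0, -1.0])
alpha, b = lssvm_fit(X, y)
pred = lssvm_predict(X, y, alpha, b, X)
```

Unlike the QP of a standard SVM, one `np.linalg.solve` call suffices here, which is exactly the computational simplification the abstract highlights.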
Minimax Programs
University of California Press, 1997
Cited by 482 (5 self)
Abstract
We introduce an optimization problem called a minimax program that is similar to a linear program, except that the addition operator is replaced in the constraint equations by the maximum operator. We clarify the relation of this problem to some better-known problems. We identify an interesting spec ... highly effective algorithms for solution of various classes of linear programs. Linear programming represents one of the major achievements of the operations research and mathematical programming community. Supported in part by a National Science Foundation Graduate Fellowship. In this paper we ...
One-Class SVMs for Document Classification
Journal of Machine Learning Research, 2001
Cited by 185 (3 self)
Abstract
We implemented versions of the SVM appropriate for one-class classification in the context of information retrieval. The experiments were conducted on the standard Reuters data set. For the SVM implementation we used both a version of Schölkopf et al. and a somewhat different version of one-class SV ...
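For a sense of what a one-class SVM computes, here is a small numpy sketch of the Schölkopf-style ν one-class dual solved by projected gradient. The solver choice, RBF kernel, parameter values, and heuristic offset selection are all illustrative assumptions for this listing, not the paper's implementation.

```python
import numpy as np

def rbf(A, B, g=0.5):
    # Pairwise RBF (Gaussian) kernel between rows of A and rows of B.
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-g * d2)

def proj_capped_simplex(v, cap):
    # Euclidean projection onto {a : sum(a) = 1, 0 <= a_i <= cap},
    # found by bisecting on a scalar shift tau.
    lo, hi = v.min() - cap, v.max()
    for _ in range(100):
        tau = 0.5 * (lo + hi)
        if np.clip(v - tau, 0.0, cap).sum() > 1.0:
            lo = tau
        else:
            hi = tau
    return np.clip(v - 0.5 * (lo + hi), 0.0, cap)

def one_class_svm(X, nu=0.2, g=0.5, steps=2000, lr=0.1):
    # Projected gradient on the nu one-class SVM dual:
    #   min 0.5 a^T K a   s.t.   0 <= a_i <= 1/(nu*n),  sum(a) = 1.
    n = len(X)
    cap = 1.0 / (nu * n)
    K = rbf(X, X, g)
    a = np.full(n, 1.0 / n)                 # feasible starting point
    for _ in range(steps):
        a = proj_capped_simplex(a - lr * (K @ a), cap)
    # Offset rho from the coefficient nearest the middle of its box
    # (a heuristic stand-in for picking an in-bounds support vector).
    sv = int(np.argmin(np.abs(a - 0.5 * cap)))
    rho = K[sv] @ a
    return a, rho

def score(X_train, a, rho, x, g=0.5):
    # Positive inside the learned support region, negative outside.
    return float(rbf(x[None, :], X_train, g)[0] @ a - rho)

# Train on one cluster of "normal" points; score an inlier and an outlier.
rng = np.random.default_rng(1)
X = rng.normal(0.0, 0.3, (40, 2))
a, rho = one_class_svm(X)
inlier = score(X, a, rho, np.zeros(2))
outlier = score(X, a, rho, np.array([5.0, 5.0]))
```

In a document-classification setting like the paper's, the training set would be vectors of the single target category, and documents scoring below the threshold would be rejected as outside that category.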
Multiple kernel learning, conic duality, and the SMO algorithm
In Proceedings of the 21st International Conference on Machine Learning (ICML), 2004
Cited by 445 (31 self)
Abstract
While classical kernel-based classifiers are based on a single kernel, in practice it is often desirable to base classifiers on combinations of multiple kernels. Lanckriet et al. (2004) considered conic combinations of kernel matrices for the support vector machine (SVM), and showed that the optimiz ...; moreover, the sequential minimal optimization (SMO) techniques that are essential in large-scale implementations of the SVM cannot be applied because the cost function is non-differentiable. We propose a novel dual formulation of the QCQP as a second-order cone programming problem, and show how to exploit ...