CiteSeerX

Results 1 - 10 of 119

Making Large-Scale SVM Learning Practical

by Thorsten Joachims , 1998
"... Training a support vector machine (SVM) leads to a quadratic optimization problem with bound constraints and one linear equality constraint. Despite the fact that this type of problem is well understood, there are many issues to be considered in designing an SVM learner. In particular, for large ..."
Abstract - Cited by 1861 (17 self)
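The quadratic problem this snippet refers to is the standard soft-margin SVM dual; as a sketch in conventional notation (not quoted from the paper itself), with kernel K and regularization parameter C:

```latex
% SVM dual: maximize over the multipliers \alpha
\max_{\alpha \in \mathbb{R}^n} \quad
  \sum_{i=1}^{n} \alpha_i
  - \frac{1}{2} \sum_{i=1}^{n} \sum_{j=1}^{n}
      y_i \, y_j \, \alpha_i \, \alpha_j \, K(x_i, x_j)
% the bound constraints:
\quad \text{s.t.} \quad 0 \le \alpha_i \le C, \qquad i = 1, \dots, n,
% the single linear equality constraint:
\qquad \sum_{i=1}^{n} \alpha_i y_i = 0 .
```

The n-by-n kernel matrix in the quadratic term is what makes this problem expensive at scale, which is the issue the paper addresses.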

Making Large-Scale Support Vector Machine Learning Practical

by Thorsten Joachims , 1998
"... Training a support vector machine (SVM) leads to a quadratic optimization problem with bound constraints and one linear equality constraint. Despite the fact that this type of problem is well understood, there are many issues to be considered in designing an SVM learner. In particular, for large ..."
Abstract - Cited by 628 (1 self)

Exploiting Code-Redundancies in ECOC for Reducing its Training Complexity using Incremental and SVM Learners

by Sang-hyeun Park, Lorenz Weizsäcker, Johannes Fürnkranz , 2010
"... We study an approach for speeding up the training of error-correcting output codes (ECOC) classifiers. The key idea is to avoid unnecessary computations by exploiting the overlap of the different training sets in the ECOC ensemble. Instead of re-training each classifier from scratch ..."
Abstract
... We experimentally evaluate the algorithm with Hoeffding trees, as an example of incremental learners, where the classifier adaptation is trivial, and with SVMs, where we employ an adaptation strategy based on adapted caching and weight re-use, which guarantees that the learned model ...

Reducing Multiclass to Binary: A Unifying Approach for Margin Classifiers

by Erin L. Allwein, Robert E. Schapire, Yoram Singer - JOURNAL OF MACHINE LEARNING RESEARCH , 2000
"... We present a unifying framework for studying the solution of multiclass categorization problems by reducing them to multiple binary problems that are then solved using a margin-based binary learning algorithm. The proposed framework unifies some of the most popular approaches in which each class ..."
Abstract - Cited by 561 (20 self)
... generalization error analysis for general output codes with AdaBoost as the binary learner. Experimental results with SVM and AdaBoost show that our scheme provides a viable alternative to the most commonly used multiclass algorithms.
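The reduction this entry describes can be sketched in a few lines: each class gets a ±1 codeword (a row of a code matrix), one binary learner is trained per column, and a test point is decoded to the class whose codeword is nearest in Hamming distance. The toy 1-D data, the code matrix, and the threshold "stump" standing in for a margin classifier are illustrative assumptions, not taken from the paper:

```python
# Sketch of multiclass-to-binary reduction via an output code matrix,
# in the spirit of Allwein, Schapire & Singer (2000). The code matrix
# and the stump learner below are illustrative assumptions; the paper's
# framework allows general code matrices and any margin-based learner.

# Code matrix for 3 classes: row r is the +/-1 codeword of class r.
# Chosen so each column is a threshold-separable dichotomy on the toy data.
CODE = [
    [+1, +1],  # class 0
    [-1, +1],  # class 1
    [-1, -1],  # class 2
]

def train_stump(xs, ys):
    """Stand-in binary learner: pick the threshold and orientation on the
    1-D feature that minimize training error against the +/-1 labels."""
    best = None
    for t in sorted(set(xs)):
        for sign in (+1, -1):
            preds = [sign if x >= t else -sign for x in xs]
            err = sum(p != y for p, y in zip(preds, ys))
            if best is None or err < best[0]:
                best = (err, t, sign)
    _, t, sign = best
    return lambda x: sign if x >= t else -sign

def train_ecoc(xs, labels, code):
    """Train one binary learner per column of the code matrix."""
    learners = []
    for col in range(len(code[0])):
        ys = [code[label][col] for label in labels]
        learners.append(train_stump(xs, ys))
    return learners

def predict_ecoc(learners, code, x):
    """Hamming decoding: the class whose codeword is closest to the
    vector of binary predictions."""
    word = [h(x) for h in learners]
    def dist(r):
        return sum(w != c for w, c in zip(word, code[r]))
    return min(range(len(code)), key=dist)

# Toy 1-D data: class 0 near 0, class 1 near 5, class 2 near 10.
xs = [0.0, 1.0, 4.5, 5.5, 9.0, 10.0]
labels = [0, 0, 1, 1, 2, 2]
learners = train_ecoc(xs, labels, CODE)
print([predict_ecoc(learners, CODE, x) for x in [0.5, 5.0, 9.5]])  # → [0, 1, 2]
```

Swapping the stump for an SVM or AdaBoost gives the margin-classifier instantiations the abstract mentions; the paper's loss-based decoding generalizes the Hamming decoding used here.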

Learner

by Thorsten Joachims
"... What is an SVM? ..."
Abstract

SVM: Reduction of Learning Time

by Sid Ahmed Mostefaoui, Lynda Zaoui
"... Training a support vector machine (SVM) leads to a quadratic optimization problem with bound constraints and one linear equality constraint. Despite the fact that this type of problem is well understood, there are many issues to be considered in designing an SVM learner. In particular, for large ..."
Abstract

Bias-Variance Analysis and Ensembles of SVM

by Giorgio Valentini, Thomas G. Dietterich , 2002
"... Accuracy, diversity, and learning characteristics of base learners critically influence the effectiveness of ensemble methods. Bias-variance decomposition of the error can be used as a tool to gain insights into the behavior of learning algorithms, in order to properly design ensemble methods well-tuned to the properties of a specific base learner. In this work we analyse bias-variance decomposition of the error in Support Vector Machines (SVM), characterizing it with respect to the kernel and its parameters. We show that the bias-variance decomposition offers a rationale to develop ensemble methods using SVMs ..."
Abstract - Cited by 9 (2 self)
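For squared loss, the decomposition this entry appeals to takes the classical form below (the paper itself works with a generalized decomposition that also covers 0/1 loss; this is only the standard squared-loss sketch), writing f_D for the model trained on sample D, \bar{f}(x) = E_D[f_D(x)] for the average prediction, and f*(x) for the optimal prediction:

```latex
% classical bias-variance decomposition of expected squared error:
\mathbb{E}_{D,\,y}\!\left[\bigl(y - f_D(x)\bigr)^2\right]
  = \underbrace{\sigma^2}_{\text{noise}}
  + \underbrace{\bigl(\bar{f}(x) - f^*(x)\bigr)^2}_{\text{bias}^2}
  + \underbrace{\mathbb{E}_D\!\left[\bigl(f_D(x) - \bar{f}(x)\bigr)^2\right]}_{\text{variance}}
```

Ensemble methods typically attack the variance term, which is why characterizing how kernel parameters shift the bias/variance balance gives a rationale for designing SVM ensembles.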

Good Learners for Evil Teachers

by Ofer Dekel, Ohad Shamir
"... We consider a supervised machine learning scenario where labels are provided by a heterogeneous set of teachers, some of which are mediocre, incompetent, or perhaps even malicious. We present an algorithm, built on the SVM framework, that explicitly attempts to cope with low-quality and malicious teachers ..."
Abstract - Cited by 34 (2 self)

RandSVM: A Randomized Algorithm for Training Support Vector Machines on Large Datasets

by Vinay Jethava, Krishnan Suresh, Chiranjib Bhattacharyya, Ramesh Hariharan
"... We propose a randomized algorithm for training Support Vector Machines (SVMs) on large datasets. Using ideas from random projections, we show that the combinatorial dimension of SVMs is O(log n) with high probability. This estimate of the combinatorial dimension is used to derive an iterative ..."
Abstract
... life data sets demonstrate that the algorithm scales up existing SVM learners, without loss of accuracy.

SVM-BASED NEGATIVE DATA MINING TO BINARY CLASSIFICATION

by Fuhua Jiang , 2006
"... The properties of a training data set, such as size, distribution, and number of attributes, significantly contribute to the generalization error of a learning machine. A not-well-distributed data set is prone to lead to a partially overfitting model. The two approaches proposed in this paper for the binary ..."
Abstract
... algorithm learner by creating one or two additional hypotheses, audit and booster, to mine the negative examples output from the learner. The learner employs a regular support vector machine to classify main examples and recognize which examples are negative. The audit works on the negative training data ...

Developed at and hosted by The College of Information Sciences and Technology

© 2007-2019 The Pennsylvania State University