Results 11 – 20 of 15,369
Hybrid MPI/OpenMP parallel linear support vector machine training
 JMLR
"... Support vector machines are a powerful machine learning technology, but the training process involves a dense quadratic optimization problem and is computationally challenging. A parallel implementation of linear Support Vector Machine training has been developed, using a combination of MPI and Open ..."
Abstract

Cited by 7 (1 self)
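The data-parallel scheme this abstract alludes to can be sketched in a few lines. This is not the paper's MPI/OpenMP implementation — just a minimal numpy illustration, under the assumption that each process computes the hinge-loss subgradient over its own data partition and the results are summed (the sum standing in for an `MPI_Allreduce`):

```python
import numpy as np

def chunk_grad(Xc, yc, w, C):
    """Hinge-loss subgradient contribution of one data partition
    (the per-process work in a data-parallel scheme)."""
    margins = yc * (Xc @ w)
    viol = margins < 1.0          # points violating the margin
    # subgradient of C * sum(max(0, 1 - y_i * w.x_i)) over this chunk
    return -C * (yc[viol, None] * Xc[viol]).sum(axis=0)

def svm_subgradient(X, y, w, C=1.0, n_parts=4):
    """Full subgradient of (1/2)||w||^2 + C * sum(hinge), assembled
    from per-partition contributions; summing the parts plays the
    role of the MPI reduction across processes."""
    parts = np.array_split(np.arange(len(y)), n_parts)
    g = sum(chunk_grad(X[i], y[i], w, C) for i in parts)
    return w + g                  # w is the regularizer's gradient
```

The split is purely illustrative; the result is identical for any number of partitions, which is what makes the reduction-based parallelization correct.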
Exploiting separability in large-scale linear Support Vector Machine training
, 2009
"... Linear support vector machine training can be represented as a large quadratic program. We present an efficient and numerically stable algorithm for this problem using interior point methods, which requires only O(n) operations per iteration. Through exploiting the separability of the Hessian, we p ..."
Abstract

Cited by 5 (5 self)
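One plausible reading of the O(n)-per-iteration claim: for a linear SVM the dense dual Hessian is diagonal-plus-low-rank, so the linear systems inside an interior point method never need the full n × n matrix. A hedged sketch of that linear algebra (not the authors' code), using the Sherman-Morrison-Woodbury identity:

```python
import numpy as np

def solve_diag_plus_lowrank(d, Z, r):
    """Solve (diag(d) + Z @ Z.T) x = r without forming the dense
    n x n matrix, via Sherman-Morrison-Woodbury:
      (D + Z Z^T)^-1 = D^-1 - D^-1 Z (I + Z^T D^-1 Z)^-1 Z^T D^-1.
    Cost is O(n k^2) for Z of shape (n, k), linear in n for fixed k."""
    Dinv_r = r / d
    Dinv_Z = Z / d[:, None]
    k = Z.shape[1]
    small = np.eye(k) + Z.T @ Dinv_Z          # only a k x k system
    return Dinv_r - Dinv_Z @ np.linalg.solve(small, Z.T @ Dinv_r)
```

For linear SVM training, Z would be the (label-signed) data matrix, so k is the feature dimension and each interior point step stays linear in the number of examples.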
Locally Linear Support Vector Machines, Ľubor Ladický
"... Linear support vector machines (SVMs) have become popular for solving classification tasks due to their fast and simple online application to large-scale data sets. However, many problems are not linearly separable. For these problems kernel-based SVMs are often used, but unlike their linear variant ..."
Abstract

Cited by 24 (1 self)
Rule Extraction from Linear Support Vector Machines
"... We describe an algorithm for converting linear support vector machines and any other arbitrary hyperplane-based linear classifiers into a set of non-overlapping rules that, unlike the original classifier, can be easily interpreted by humans. Each iteration of the rule extraction algorithm is formula ..."
Abstract
Large-scale logistic regression and linear support vector machines using Spark
 in Proceedings of the IEEE International Conference on Big Data
, 2014
"... Logistic regression and linear SVM are useful methods for large-scale classification. However, their distributed implementations have not been well studied. Recently, because of the inefficiency of the MapReduce framework on iterative algorithms, Spark, an in-memory cluster-computing platf ..."
Abstract

Cited by 7 (4 self)
Linear Support Vector Machines via Dual Cached Loops, Shin Matsushima
"... Modern computer hardware offers an elaborate hierarchy of storage subsystems with different speeds and capacities. Furthermore, processors are now inherently parallel offering the execution of several diverse threads simultaneously. This paper proposes a first SVM optimization algorithm which takes ..."
Abstract

Cited by 3 (2 self)
increases throughput. Experiments show that it outperforms other linear SVM solvers, including the award-winning work of [?], by orders of magnitude and produces more accurate solutions within a given amount of time.
Improving the Generalization of Linear Support Vector Machines: an Application to 3D Object Recognition with Cluttered Background
 In Proceedings, Support Vector Machine Workshop at the 16th International Joint Conference on Artificial Intelligence
, 1999
"... Three methods for improving the generalization performance of Linear Support Vector Machines are proposed, for the case where some dimensions in the data can be considered irrelevant to the pattern recognition task at hand. In contrast to other methods, the generalization improvement is not o ..."
Abstract

Cited by 6 (0 self)
SVMMaj: A Majorization Approach to Linear Support Vector Machines with Different Hinge Errors
, 2007
"... Support vector machines (SVM) are becoming increasingly popular for the prediction of a binary dependent variable. SVMs perform very well with respect to competing techniques. Often, the solution of an SVM is obtained by switching to the dual. In this paper, we stick to the primal support vector mac ..."
Abstract

Cited by 1 (1 self)
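For readers unfamiliar with the primal formulation this abstract sticks to, a minimal numpy sketch — not the SVMMaj majorization algorithm itself — of the primal objective with the standard (absolute) hinge error, plus one plain subgradient step as a hypothetical stand-in optimizer:

```python
import numpy as np

def primal_svm_objective(w, b, X, y, lam=1.0):
    """Primal linear SVM objective with the absolute hinge error:
    lam * ||w||^2 + sum_i max(0, 1 - y_i (x_i . w + b))."""
    hinge = np.maximum(0.0, 1.0 - y * (X @ w + b))
    return lam * (w @ w) + hinge.sum()

def subgradient_step(w, b, X, y, lam=1.0, lr=0.01):
    """One subgradient descent step on the primal objective
    (illustrative only; SVMMaj instead minimizes a sequence of
    majorizing quadratic functions)."""
    margins = y * (X @ w + b)
    viol = margins < 1.0                  # points inside the margin
    gw = 2.0 * lam * w - (y[viol, None] * X[viol]).sum(axis=0)
    gb = -y[viol].sum()
    return w - lr * gw, b - lr * gb
```

Working in the primal like this keeps the variables at the feature dimension and makes it easy to swap in other hinge errors, which is the flexibility the abstract emphasizes.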
Supplementary Materials for “Large-scale Logistic Regression and Linear Support Vector Machines Using Spark”
"... This document presents some materials not included in the paper. In Section 2, the details of applying the distributed TRON algorithm to solve L2-loss SVM are described. We ..."
Abstract
A tutorial on support vector machines for pattern recognition
 Data Mining and Knowledge Discovery
, 1998
"... The tutorial starts with an overview of the concepts of VC dimension and structural risk minimization. We then describe linear Support Vector Machines (SVMs) for separable and non-separable data, working through a non-trivial example in detail. We describe a mechanical analogy, and discuss when SV ..."
Abstract

Cited by 3393 (12 self)