Results 1–10 of 1,223,511
The Nature of Statistical Learning Theory, 1999
Cited by 12927 (32 self)
"Statistical learning theory was introduced in the late 1960s. Until the 1990s it was a purely theoretical analysis of the problem of function estimation from a given collection of data. In the middle of the 1990s new types of learning algorithms (called support vector machines) based on the deve ..."
Incremental and Decremental Learning for Linear Support Vector Machines
Cited by 1 (0 self)
"We present a method to find the exact maximal margin hyperplane for linear Support Vector Machines when a new (existing) component is added to (removed from) the inner product. The maximal margin hyperplane with the new inner product is obtained in terms of that for the old inner product ..."
Rule extraction from linear support vector machines
In KDD, 2005
Cited by 27 (0 self)
"We describe an algorithm for converting linear support vector machines and any other arbitrary hyperplane-based linear classifiers into a set of non-overlapping rules that, unlike the original classifier, can be easily interpreted by humans. Each iteration of the rule extraction algorithm is formula ..."
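The snippet above only sketches the setting, so here is a minimal illustration (not the paper's algorithm) of why hyperplane-to-rule conversion is possible at all: an axis-aligned box `[lower, upper]` is a valid "positive" rule for a linear classifier `w·x + b > 0` exactly when the box corner that minimizes `w·x` is still on the positive side. All names below are hypothetical.

```python
# Sketch: checking that an axis-aligned rule (a box) is consistent with
# a linear classifier w.x + b > 0. The corner minimizing w.x picks the
# lower bound where w_i > 0 and the upper bound where w_i <= 0.

def box_is_positive(w, b, lower, upper):
    worst = sum(wi * (li if wi > 0 else ui)
                for wi, li, ui in zip(w, lower, upper))
    return worst + b > 0

# Hyperplane x1 + x2 - 1 > 0: the box [0.6,1] x [0.6,1] is a valid rule,
# while [0,1] x [0,1] straddles the decision boundary.
print(box_is_positive([1.0, 1.0], -1.0, [0.6, 0.6], [1.0, 1.0]))  # True
print(box_is_positive([1.0, 1.0], -1.0, [0.0, 0.0], [1.0, 1.0]))  # False
```

A rule-extraction procedure can then search for large boxes passing this test; the actual iteration described in the paper is more involved.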
A Bahadur Representation of the Linear Support Vector Machine
Cited by 4 (0 self)
"Editor: John Shawe-Taylor. The support vector machine has been successful in a variety of applications. Also on the theoretical front, statistical properties of the support vector machine have been studied quite extensively, with particular attention to its Bayes risk consistency under some conditions. In this paper, we study somewhat basic statistical properties of the support vector machine yet to be investigated, namely the asymptotic behavior of the coefficients of the linear support vector machine. A Bahadur-type representation of the coefficients is established under appropriate ..."
Consensus-based distributed linear support vector machines
In ACM/IEEE International Conference on Information Processing in Sensor Networks, 2010
Cited by 2 (0 self)
"This paper develops algorithms to train linear support vector machines (SVMs) when training data are distributed across different nodes and their communication to a centralized node is prohibited due to, for example, communication overhead or privacy reasons. To accomplish this goal, the centralized ..."
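To make the distributed setting concrete, here is a naive baseline sketch, not the paper's consensus algorithm: each node trains a linear SVM on its local shard by hinge-loss subgradient descent (Pegasos-style updates), and the nodes then average their weight vectors instead of pooling raw data. All function names and the synthetic data are assumptions for illustration.

```python
import numpy as np

def local_svm(X, y, lam=0.01, epochs=50):
    """Pegasos-style hinge-loss subgradient descent on one node's data."""
    w, t = np.zeros(X.shape[1]), 0
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            t += 1
            eta = 1.0 / (lam * t)                 # decaying step size
            if yi * np.dot(w, xi) < 1:            # margin violated
                w = (1 - eta * lam) * w + eta * yi * xi
            else:
                w = (1 - eta * lam) * w
    return w

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 2))
y = np.sign(X[:, 0] + X[:, 1])                    # separable toy labels
parts = np.array_split(np.arange(200), 4)         # 4 "nodes"
w_avg = np.mean([local_svm(X[p], y[p]) for p in parts], axis=0)
acc = np.mean(np.sign(X @ w_avg) == y)
```

Simple weight averaging degrades when local data distributions differ, which is one motivation for the consensus-constrained formulations the paper develops.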
Deep learning using linear support vector machines
In ICML, 2013
Cited by 10 (1 self)
"Recently, fully-connected and convolutional neural networks have been trained to achieve state-of-the-art performance on a wide variety of tasks such as speech recognition, image classification, natural language processing, and bioinformatics. For classification tasks, most of these “deep learning” models employ the softmax activation function for prediction and minimize cross-entropy loss. In this paper, we demonstrate a small but consistent advantage of replacing the softmax layer with a linear support vector machine. Learning minimizes a margin-based loss instead of the cross-entropy loss ..."
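The contrast described above can be sketched numerically. This is a hedged illustration of the two top-layer objectives only (one plausible one-vs-rest squared-hinge formulation, not necessarily the paper's exact loss); `scores` is assumed to be the `(n, K)` output of the final linear layer.

```python
import numpy as np

def softmax_xent(scores, y):
    """Mean cross-entropy of a softmax over class scores."""
    z = scores - scores.max(axis=1, keepdims=True)   # numerical stability
    logp = z - np.log(np.exp(z).sum(axis=1, keepdims=True))
    return -logp[np.arange(len(y)), y].mean()

def l2svm_loss(scores, y, margin=1.0):
    """One-vs-rest squared hinge: target +1 for the true class, -1 otherwise."""
    t = -np.ones_like(scores)
    t[np.arange(len(y)), y] = 1.0
    return np.mean(np.maximum(0.0, margin - t * scores) ** 2)

scores = np.array([[2.0, -1.0, 0.5]])
y = np.array([0])
```

Both losses are differentiable almost everywhere in `scores`, so swapping one for the other leaves the rest of the backpropagation pipeline unchanged; the squared hinge is flat once every margin is satisfied, while cross-entropy keeps pushing scores apart.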
Multiclass Latent Locally Linear Support Vector Machines
Cited by 3 (1 self)
"Kernelized Support Vector Machines (SVM) have gained the status of off-the-shelf classifiers, able to deliver state-of-the-art performance on almost any problem. Still, their practical use is constrained by their computational and memory complexity, which grows super-linearly with the number of tra ..."
Nomograms for Visualizing Linear Support Vector Machines
"Support vector machines are often considered to be black box learning algorithms. We show that for linear kernels it is possible to open this box and visually depict the content of the SVM classifier in high-dimensional space in the interactive format of a nomogram. We provide a cross-calibration met ..."
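The nomogram visualization rests on a simple fact worth making explicit: a linear SVM's decision score decomposes additively over features, so each feature can be drawn on its own axis as a "points" contribution and the contributions summed. A minimal sketch of that decomposition (the function name is an assumption, not from the paper):

```python
# A linear SVM score w.x + b splits into per-feature terms w_i * x_i,
# which is what a nomogram plots on one axis per feature.

def contributions(w, x, b):
    parts = [wi * xi for wi, xi in zip(w, x)]
    return parts, sum(parts) + b   # per-feature "points" and total score

parts, score = contributions(w=[0.8, -0.5, 0.1], x=[1.0, 2.0, 3.0], b=0.2)
```

Here the second feature pulls the score down by 1.0 while the others push it up, and the total score of 0.3 lands on the positive side; a kernelized SVM has no such exact additive split, which is why the technique targets linear kernels.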
Hybrid MPI/OpenMP parallel linear support vector machine training
In JMLR
Cited by 8 (1 self)
"Support vector machines are a powerful machine learning technology, but the training process involves a dense quadratic optimization problem and is computationally challenging. A parallel implementation of linear Support Vector Machine training has been developed, using a combination of MPI and OpenMP ..."
Rule Extraction from Linear Support Vector Machines
"We describe an algorithm for converting linear support vector machines and any other arbitrary hyperplane-based linear classifiers into a set of non-overlapping rules that, unlike the original classifier, can be easily interpreted by humans. Each iteration of the rule extraction algorithm is formula ..."