Results 1–10 of 29,076
The Nature of Statistical Learning Theory, 1999
"... Statistical learning theory was introduced in the late 1960s. Until the 1990s it was a purely theoretical analysis of the problem of function estimation from a given collection of data. In the middle of the 1990s new types of learning algorithms (called support vector machines) based on the deve ..."
Cited by 13236 (32 self)
Support Vector Regression Machines, 1996
"... A new regression technique based on Vapnik's concept of support vectors is introduced. We compare support vector regression (SVR) with a committee regression technique (bagging) based on regression trees and ridge regression done in feature space. On the basis of these experiments, it is expect ..."
Cited by 256 (10 self)
A tutorial on support vector regression, 2004
"... In this tutorial we give an overview of the basic ideas underlying Support Vector (SV) machines for function estimation. Furthermore, we include a summary of currently used algorithms for training SV machines, covering both the quadratic (or convex) programming part and advanced methods for dealing ..."
Cited by 865 (3 self)
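As a concrete illustration of the ε-SVR machinery the tutorial above surveys, here is a minimal sketch using scikit-learn's `SVR` (an assumed library choice for illustration, not one made by any of the cited papers):

```python
# Minimal epsilon-SVR sketch using scikit-learn (an assumption of this
# sketch, not part of the cited tutorial). Fits a noisy sine curve with
# an RBF kernel and an epsilon-insensitive tube of half-width 0.1.
import numpy as np
from sklearn.svm import SVR

rng = np.random.default_rng(0)
X = np.sort(rng.uniform(0, 2 * np.pi, 200)).reshape(-1, 1)
y = np.sin(X).ravel() + rng.normal(0, 0.1, 200)

model = SVR(kernel="rbf", C=10.0, epsilon=0.1)
model.fit(X, y)

# Only points on or outside the epsilon-tube contribute support vectors.
print(len(model.support_), "support vectors out of", len(X), "samples")
```

With `epsilon` roughly matching the noise level, a sizeable fraction of the training points fall strictly inside the insensitive tube and drop out of the solution, which is the sparsity property the tutorial emphasizes.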
Bayesian Support Vector Regression
In Proceedings of the Eighth International Workshop on Artificial Intelligence and Statistics, 2001
"... We show that the Bayesian evidence framework can be applied to both ε-support vector regression (ε-SVR) and ν-support vector regression (ν-SVR) algorithms. Standard SVR training can be regarded as performing level one inference of the evidence framework, while levels two and three allo ..."
Cited by 13 (2 self)
Support vector regression
Neural Information Processing Letters and Reviews, 2007
"... Instead of minimizing the observed training error, Support Vector Regression (SVR) attempts to minimize a bound on the generalization error so as to achieve good generalization performance. The idea of SVR is based on the computation of a linear regression function in a high-dimensional feature space ..."
Cited by 20 (0 self)
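For reference, the bound-minimizing formulation these abstracts allude to is the standard linear ε-SVR primal (standard notation: tube half-width ε, trade-off constant C, slack variables ξ, ξ*):

```latex
\min_{w,\, b,\, \xi,\, \xi^*} \;
  \frac{1}{2}\lVert w \rVert^2 + C \sum_{i=1}^{n} \left(\xi_i + \xi_i^*\right)
\quad \text{s.t.} \quad
\begin{cases}
  y_i - \langle w, x_i \rangle - b \le \varepsilon + \xi_i, \\[2pt]
  \langle w, x_i \rangle + b - y_i \le \varepsilon + \xi_i^*, \\[2pt]
  \xi_i,\ \xi_i^* \ge 0, \qquad i = 1, \dots, n.
\end{cases}
```

The nonlinear case replaces the inner products with kernel evaluations in the dual, which is the "high-dimensional feature space" computation the abstract refers to.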
Complex Support Vector Regression
"... We present a support vector regression (SVR) rationale for treating complex data, exploiting the notions of widely linear estimation and pure complex kernels. To compute the Lagrangian and derive the dual problem, we employ the recently presented Wirtinger's calculus on complex RKHS. ..."
SUPPORT VECTOR REGRESSION VIA MATHEMATICA, 2005
"... In this tutorial-type paper a Mathematica function for Support Vector Regression has been developed. Summarizing the main definitions and theorems of SVR, the detailed implementation steps of this function are presented and its application is illustrated by solving three 2D function approximation tes ..."
Support Vector Regression Analysis . . ., 2009
"... To remain profitable under tight competition, a leasing company has to offer a good leasing price. In order to determine the right price, it is necessary to predict the future price of a second-hand car. By knowing the car's value depreciation, the leasing price could be set to cover it. The ..."
"... technique which is independent of input dimension, namely Support Vector Regression, will be applied to overcome this potential problem. The forecasting accuracy will then be compared against the statistical regression model. In particular, a fully automatic approach for tuning and applying SVR is developed ..."
Accurate Online Support Vector Regression
Neural Computation, 2003
"... Conventional batch implementations of Support Vector Regression (SVR) are inefficient when used for applications such as online learning or leave-one-out cross-validation, because they must be retrained from scratch every time the training set is modified. An Accurate Online Support Vector Regressio ..."
Cited by 36 (2 self)
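The inefficiency the abstract describes is easy to see in a naive leave-one-out loop, sketched here with scikit-learn's batch `SVR` (an assumed library for illustration; the paper's incremental algorithm is designed to avoid exactly these n full retrains):

```python
# Naive leave-one-out CV for SVR: one full retrain per held-out point,
# i.e. n batch fits -- the cost an accurate online/incremental SVR
# avoids. (scikit-learn is an assumption of this sketch.)
import numpy as np
from sklearn.svm import SVR

rng = np.random.default_rng(1)
X = rng.uniform(-1, 1, (40, 1))
y = X.ravel() ** 2 + rng.normal(0, 0.05, 40)

errors = []
for i in range(len(X)):
    mask = np.arange(len(X)) != i                      # hold out sample i
    m = SVR(kernel="rbf", C=1.0, epsilon=0.05).fit(X[mask], y[mask])
    errors.append((m.predict(X[i:i + 1])[0] - y[i]) ** 2)

print("LOO MSE:", np.mean(errors))                     # cost: n full fits
```

An incremental formulation instead updates the existing solution when one sample is added or removed, turning the per-fold cost from a full quadratic program into a much cheaper adjustment.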
Massive Support Vector Regression
Data Mining Institute, Computer Sciences Department, University of Wisconsin, 1999
"... The problem of tolerant data fitting by a nonlinear surface, induced by a kernel-based support vector machine [19], is formulated as a linear program with fewer variables than other linear programming formulations [17]. A generalization of the linear programming chunking algorithm ..."
Cited by 10 (5 self)
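The linear-programming flavor of SVR can be sketched in miniature: below, a 1-norm regularized ε-insensitive fit of a straight line is posed directly as an LP and handed to `scipy.optimize.linprog` (SciPy is an assumption of this sketch; this toy deliberately omits the kernel surface and the chunking machinery of the paper):

```python
# Toy LP formulation of epsilon-insensitive regression: minimize
# |w| + C * sum(xi) subject to |y_i - w*x_i - b| <= eps + xi_i.
# Variables: [w+, w-, b, xi_1..xi_n] with w = w+ - w-.
# (SciPy/linprog is an assumption of this sketch, not the paper's code.)
import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(0)
n = 30
x = rng.uniform(-1, 1, n)
y = 2.0 * x + 0.5 + rng.normal(0, 0.05, n)   # true line: w=2, b=0.5
eps, C = 0.1, 10.0

c = np.concatenate(([1.0, 1.0, 0.0], C * np.ones(n)))  # objective
A = np.zeros((2 * n, n + 3))
b_ub = np.zeros(2 * n)
for i in range(n):
    # y_i - w x_i - b <= eps + xi_i
    A[i, 0], A[i, 1], A[i, 2], A[i, 3 + i] = -x[i], x[i], -1.0, -1.0
    b_ub[i] = eps - y[i]
    # w x_i + b - y_i <= eps + xi_i
    A[n + i, 0], A[n + i, 1], A[n + i, 2], A[n + i, 3 + i] = x[i], -x[i], 1.0, -1.0
    b_ub[n + i] = eps + y[i]

bounds = [(0, None), (0, None), (None, None)] + [(0, None)] * n
res = linprog(c, A_ub=A, b_ub=b_ub, bounds=bounds)
w, b = res.x[0] - res.x[1], res.x[2]
print(f"recovered w = {w:.2f}, b = {b:.2f}")
```

Splitting w into nonnegative parts w+ and w- is the standard LP trick for the 1-norm regularizer; the same device scales to a vector of kernel coefficients, which is where the paper's reduced-variable formulation and chunking come in.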