Results 1 – 10 of 302
A tutorial on support vector regression
, 2004
"... In this tutorial we give an overview of the basic ideas underlying Support Vector (SV) machines for function estimation. Furthermore, we include a summary of currently used algorithms for training SV machines, covering both the quadratic (or convex) programming part and advanced methods for dealing ..."
Cited by 865 (3 self)
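The tutorial above concerns training SV machines for regression. As a minimal sketch of the technique it surveys, the following uses scikit-learn's `SVR` (an assumed, widely available implementation; the tutorial itself treats the underlying quadratic-programming formulation, which the library solves internally):

```python
# Hedged sketch: epsilon-SVR with an RBF kernel on noisy 1-D data.
import numpy as np
from sklearn.svm import SVR

rng = np.random.default_rng(0)
X = np.sort(rng.uniform(0, 5, size=(40, 1)), axis=0)
y = np.sin(X).ravel() + rng.normal(scale=0.1, size=40)

# epsilon sets the width of the insensitive tube around the regression
# function; C trades flatness against deviations larger than epsilon.
model = SVR(kernel="rbf", C=1.0, epsilon=0.1)
model.fit(X, y)
pred = model.predict(X[:5])
```

Only the training points outside (or on) the epsilon-tube become support vectors, which is what keeps the resulting model sparse.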
Making Large-Scale SVM Learning Practical
, 1998
"... Training a support vector machine (SVM) leads to a quadratic optimization problem with bound constraints and one linear equality constraint. Despite the fact that this type of problem is well understood, there are many issues to be considered in designing an SVM learner. In particular, for large learning tasks with many training examples, off-the-shelf optimization techniques for general quadratic programs quickly become intractable in their memory and time requirements. SVMlight is an implementation of an SVM learner which addresses the problem of large tasks. This chapter presents algorithmic ..."
Cited by 1861 (17 self)
Comparing Support Vector Machines with Gaussian Kernels to Radial Basis Function Classifiers
 IEEE TRANSACTIONS ON SIGNAL PROCESSING
, 1997
"... The Support Vector (SV) machine is a novel type of learning machine, based on statistical learning theory, which contains polynomial classifiers, neural networks, and radial basis function (RBF) networks as special cases. In the RBF case, the SV algorithm automatically determines centers, weights an ..."
Cited by 183 (13 self)
Preference Queries with SV-Semantics
, 2005
"... Personalization of database queries requires a semantically rich, easy-to-handle and flexible preference model. Building on preferences as strict partial orders we provide a variety of intuitive base preference constructors for numerical and categorical data, including so-called d-parameters. As a n ..."
Cited by 40 (9 self)
... show that known laws from preference relational algebra remain valid under SV-semantics. Since most of these laws rely on transitivity, preservation of strict partial order is essential to algebraically optimize complex preference queries. Similarly, well-known efficient evaluation algorithms ...
SV MIXTURE, CLASSIFICATION USING EM ALGORITHM
"... The present paper presents a theoretical extension of our earlier work entitled "A comparative study of two models SV with MCMC algorithm", Rev Quant Finan Acc (2012) ..."
SV Estimation of a Distribution's Support
, 1999
"... Suppose you are given some dataset drawn from an underlying probability distribution P and you want to estimate a subset S of input space such that the probability that a test point drawn from P lies outside of S is bounded by some a priori specified ν, 0 < ν ≤ 1. We propose an algorithm which appro ..."
Cited by 38 (2 self)
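The support-estimation problem the abstract describes is what one-class SV methods solve. A minimal sketch using scikit-learn's `OneClassSVM` as an assumed stand-in for the paper's algorithm (the `nu` parameter plays the role of the a-priori specified ν, upper-bounding the fraction of training points left outside the estimated region):

```python
# Hedged sketch: estimating the support of a distribution with a
# one-class SVM; samples outside the learned region S get label -1.
import numpy as np
from sklearn.svm import OneClassSVM

rng = np.random.default_rng(1)
X = rng.normal(loc=0.0, scale=1.0, size=(200, 2))  # samples drawn from P

est = OneClassSVM(kernel="rbf", nu=0.1, gamma="scale")
est.fit(X)

# +1 marks points inside the estimated region S, -1 marks points outside.
labels = est.predict(X)
outlier_fraction = float(np.mean(labels == -1))
```

On the training set, the fraction flagged as outside S stays close to `nu`, which is how the method realizes the probability bound from the abstract.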
Input Space Versus Feature Space in Kernel-Based Methods
 IEEE TRANSACTIONS ON NEURAL NETWORKS
, 1999
"... This paper collects some ideas targeted at advancing our understanding of the feature spaces associated with support vector (SV) kernel functions. We first discuss the geometry of feature space. In particular, we review what is known about the shape of the image of input space under the feature spac ..."
Cited by 130 (3 self)
... of kernel methods. First, we use it to reduce the computational complexity of SV decision functions; second, we combine it with the Kernel PCA algorithm, thereby constructing a nonlinear statistical denoising technique which is shown to perform well on real-world data. ...
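The Kernel-PCA-based denoising the abstract mentions can be sketched with scikit-learn's `KernelPCA` (an assumed stand-in for the paper's pre-image technique; `fit_inverse_transform=True` learns an approximate map back to input space):

```python
# Hedged sketch: project noisy data onto a few nonlinear principal
# components, then reconstruct in input space; variance outside the
# retained components (ideally, the noise) is discarded.
import numpy as np
from sklearn.decomposition import KernelPCA

rng = np.random.default_rng(2)
clean = np.sin(np.linspace(0, 2 * np.pi, 50))          # shared signal
X = clean + rng.normal(scale=0.3, size=(100, 50))      # 100 noisy copies

kpca = KernelPCA(
    n_components=4, kernel="rbf", gamma=0.1,
    fit_inverse_transform=True, alpha=0.1,
)
denoised = kpca.inverse_transform(kpca.fit_transform(X))
```

The inverse transform is only an approximation: a point in RBF feature space generally has no exact pre-image in input space, which is the core difficulty the paper analyzes.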
Competition on Software Verification (SV-COMP)
, 2012
"... This report describes the definitions, rules, setup, procedure, and results of the 1st International Competition on Software Verification. The verification community has performed competitions in various areas in the past, and SV-COMP'12 is the first competition of verification tools that take soft ..."
Cited by 18 (5 self)
SV-kNNC: An Algorithm for Improving the Efficiency of k-Nearest Neighbor
 Proc. 9th Pacific Rim Intl. Conf. Artificial Intelligence (PRICAI '06)
, 2006
"... This paper proposes SV-kNNC, a new algorithm for k-Nearest Neighbor (k-NN). This algorithm consists of three steps. First, Support Vector Machines (SVMs) are applied to select some important training data. Then, k-means clustering is used to assign the weight to each training instance. Final ..."
Cited by 3 (0 self)
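The three-step pipeline the abstract outlines (SVM selection, k-means condensing, then k-NN) can be sketched as follows. This is an illustrative reconstruction with scikit-learn, not the authors' reference implementation; the choice of cluster count and the centroid-labeling rule are assumptions:

```python
# Hedged sketch: shrink the k-NN reference set via SVM support-vector
# selection and k-means, then classify against the reduced set.
import numpy as np
from sklearn.svm import SVC
from sklearn.cluster import KMeans
from sklearn.neighbors import KNeighborsClassifier
from sklearn.datasets import make_blobs

X, y = make_blobs(n_samples=300, centers=2, cluster_std=2.0, random_state=0)

# Step 1: keep only the training points the SVM deems important
# (its support vectors).
svm = SVC(kernel="rbf", gamma="scale").fit(X, y)
X_sv, y_sv = X[svm.support_], y[svm.support_]

# Step 2: condense the retained points with k-means; each centroid
# stands in for its cluster, labeled by majority vote of its members.
k = 10
km = KMeans(n_clusters=k, n_init=10, random_state=0).fit(X_sv)
centroids = km.cluster_centers_
centroid_labels = np.array(
    [np.bincount(y_sv[km.labels_ == i]).argmax() for i in range(k)]
)

# Step 3: run k-NN against the much smaller reference set.
knn = KNeighborsClassifier(n_neighbors=3).fit(centroids, centroid_labels)
acc = knn.score(X, y)
```

The efficiency gain comes entirely from shrinking the reference set: k-NN query time scales with the number of stored instances, so classifying against 10 centroids instead of 300 training points is far cheaper.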
Set-Based Variational Methods in Credal Networks: the SV2U Algorithm
"... Graphical models that represent uncertainty through sets of probability measures are often referred to as credal networks. Polynomial-time exact inference methods are available only for polytree-structured binary credal networks. In this work, we approximate potentially intractable inferen ..."
Cited by 1 (1 self)
... inferences in multi-connected binary networks by tractable inferences in polytree structures. We propose a novel set-based structural variational inference method, the SV2U algorithm. The SV2U algorithm is the first method that produces approximate inferences in large binary credal networks with theoretical ...