Results 1–10 of 47
Probabilistic Outputs for Support Vector Machines and Comparisons to Regularized Likelihood Methods
Advances in Large Margin Classifiers, 1999
"... The output of a classifier should be a calibrated posterior probability to enable postprocessing. Standard SVMs do not provide such probabilities. One method to create probabilities is to directly train a kernel classifier with a logit link function and a regularized maximum likelihood score. Howev ..."
Cited by 1051 (0 self)
sigmoid versus a kernel method trained with a regularized likelihood error function. These methods are tested on three data-mining-style data sets. The SVM+sigmoid yields probabilities of comparable quality to the regularized maximum likelihood kernel method, while still retaining the sparseness
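The sigmoid calibration this entry describes — fitting p(y=1|f) = 1 / (1 + exp(Af + B)) to held-out SVM decision values — can be sketched with scikit-learn; the synthetic dataset, split sizes, and variable names below are illustrative assumptions, not taken from the paper:

```python
# Platt-style sigmoid calibration sketch: fit a logistic sigmoid to
# held-out SVM margin scores to obtain posterior probabilities.
# Dataset and split sizes are illustrative, not from the paper.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC
from sklearn.linear_model import LogisticRegression

X, y = make_classification(n_samples=600, random_state=0)
X_tr, X_cal, y_tr, y_cal = train_test_split(X, y, test_size=0.5, random_state=0)

svm = SVC(kernel="rbf").fit(X_tr, y_tr)   # uncalibrated margin classifier
f_cal = svm.decision_function(X_cal)      # raw margin scores on held-out data

# Logistic regression on the 1-D scores recovers Platt's sigmoid (A, B).
platt = LogisticRegression().fit(f_cal.reshape(-1, 1), y_cal)
probs = platt.predict_proba(f_cal.reshape(-1, 1))[:, 1]
```

Fitting the sigmoid on data the SVM was not trained on matters: in-sample margins are biased outward, which skews the calibration.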
Predicting Good Probabilities with Supervised Learning
In Proc. Int. Conf. on Machine Learning (ICML), 2005
"... We examine the relationship between the predictions made by different learning algorithms and true posterior probabilities. We show that maximum margin methods such as boosted trees and boosted stumps push probability mass away from 0 and 1 yielding a characteristic sigmoid shaped distortion i ..."
Cited by 89 (7 self)
Highest probability SVM nearest neighbor classifier for spam filtering
"... In this paper we evaluate the performance of the highest probability SVM nearest neighbor (HPSVMNN) classifier, which combines the ideas of the SVM and kNN classifiers, on the task of spam filtering. To classify a sample, the HPSVMNN classifier does the following: for each k in a predefined set ..."
Cited by 11 (4 self)
set {k1,..., kN} it trains an SVM model on the k nearest labeled samples, uses this model to classify the given sample, and transforms the SVM output into posterior probabilities of the classes using a sigmoid approximation; then it selects that of the 2 × N resulting answers which has the highest
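The per-k procedure this entry outlines can be sketched as below; this is a hypothetical reconstruction of the idea, not the authors' implementation, and the dataset, the k values, and the helper name `hp_svm_nn_predict` are all illustrative (scikit-learn's `SVC(probability=True)` supplies the Platt sigmoid internally):

```python
# Sketch of the HP-SVM-NN idea: for each k, train an SVM on the k nearest
# labeled samples, convert its output to class probabilities via a Platt
# sigmoid, and keep the most confident answer across all k.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.neighbors import NearestNeighbors
from sklearn.svm import SVC

def hp_svm_nn_predict(X_train, y_train, x, ks=(20, 40, 80)):
    nn = NearestNeighbors(n_neighbors=max(ks)).fit(X_train)
    idx = nn.kneighbors(x.reshape(1, -1), return_distance=False)[0]
    best_label, best_prob = None, -1.0
    for k in ks:
        sub = idx[:k]
        vals, counts = np.unique(y_train[sub], return_counts=True)
        if len(vals) < 2 or counts.min() < 5:   # need both classes for the SVM
            continue                            # (and enough of each for CV)
        svm = SVC(probability=True).fit(X_train[sub], y_train[sub])
        p = svm.predict_proba(x.reshape(1, -1))[0]  # Platt-calibrated internally
        if p.max() > best_prob:
            best_label, best_prob = svm.classes_[np.argmax(p)], p.max()
    return best_label, best_prob

X, y = make_classification(n_samples=200, random_state=1)
label, prob = hp_svm_nn_predict(X[:-1], y[:-1], X[-1])
```

Retraining an SVM per query is expensive, which is presumably why the method restricts itself to small neighborhood sizes.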
Advance Probabilistic Binary Decision Tree using SVM
"... The probabilistic decision tree to an actual diagnosis database is in progress, where the performance of the probabilistic decision tree is tested in view of the size of the databases and the difficulties is that it implies for processing them. Here proposed an algorithm Advance Probabilistic Binary ..."
and a sigmoid function to map the SVM output into probabilities. Using APBDTSVM, classification accuracy can be improved and training-testing time can be reduced.
Incorporating the Boltzmann Prior in Object Detection Using SVM
"... In this paper we discuss object detection when only a small number of training examples are given. Specifically, we show how to incorporate a simple prior on the distribution of natural images into support vector machines. SVMs are known to be robust to overfitting; however, a few training examples ..."
that the corresponding positive half space will have a low probability of containing natural images (the background). Our experiments on real data sets show that the resulting detector is more robust to the choice of training examples, and substantially improves both linear and kernel SVM when trained on 10 positive and 10
Incremental Learning with SVM for Multimodal Classification of Prostatic Adenocarcinoma
"... Robust detection of prostatic cancer is a challenge due to the multitude of variants and their representation in MR images. We propose a pattern recognition system with an incremental learning ensemble algorithm using support vector machines (SVM) tackling this problem employing multimodal MR images ..."
and rotation invariant local phase quantization (RILPQ) were utilized to quantify texture information. An incremental learning ensemble SVM was implemented to suit working conditions in medical applications and to improve effectiveness and robustness of the system. The probability estimation of cancer
SVM+BiHMM: A Hybrid Statistic Model for Metadata Extraction
"... Abstract: This paper proposes SVM+BiHMM, a hybrid statistic model of metadata extraction based on SVM (support vector machine) and BiHMM (bigram HMM (hidden Markov model)). The BiHMM model modifies the HMM model with both Bigram sequential relation and position information of words, by means of dist ..."
probability adjusted by the sigmoid function of the SVM score, and the transition probability trained by the Bigram HMM. The SVM classifier benefits from the structure patterns of document line data while the Bigram HMM considers both words' Bigram sequential relation and position information, so the complementary
Unbiased SVM Density Estimation with Application to Graphical Pattern Recognition
"... Abstract. Classification of structured data (i.e., data that are represented as graphs) is a topic of interest in the machine learning community. This paper presents a different, simple approach to the problem of structured pattern recognition, relying on the description of graphs in terms of algebr ..."
of algebraic binary relations. Maximum-a-posteriori decision rules over relations require the estimation of class-conditional probability density functions (pdf) defined on graphs. A nonparametric technique for the estimation of the pdfs is introduced, on the basis of a factorization of joint probabilities
Categorization by Learning and Combining Object Parts
, 2001
"... We describe an algorithm for automatically learning discriminative components of objects with SVM classifiers. It is based on growing image parts by minimizing theoretical bounds on the error probability of an SVM. Componentbased face classifiers are then combined in a second stage to yield a h ..."
Cited by 75 (21 self)
Prediction of probability of survival in critically ill patients optimizing the Area Under the ROC Curve
, 2007
"... The paper presents a support vector method for estimating probabilities in a real world problem: the prediction of probability of survival in critically ill patients. The standard procedure with Support Vectors Machines uses Platt’s method to fit a sigmoid that transforms continuous outputs into pro ..."
Cited by 3 (0 self)
into probabilities. The method proposed here exploits the difference between maximizing the AUC and minimizing the error rate in binary classification tasks. The conclusion is that it is preferable to optimize the AUC first (using a multivariate SVM) and then fit a sigmoid. We provide experimental evidence in favor