Results 1 - 3 of 3
Optimizing kernel alignment by data translation in feature space
In ICASSP, 2008
Abstract - Cited by 3 (0 self)
Kernel-target alignment is commonly used to predict the behavior of reproducing kernels in a classification context, without training any kernel machine. In this paper, we show that a poor position of training data in feature space can drastically reduce the value of alignment. This implies that, in a kernel selection setting, the best kernel of a given collection may be characterized by a low alignment. To overcome this situation, we present a gradient ascent algorithm for maximizing the alignment by data translation in feature space. The aim is to reduce the bias introduced by the translation non-invariance of this criterion. Experimental results on multidimensional benchmarks show the effectiveness of our approach. Index Terms — kernel alignment, data translation, SVM
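The alignment criterion and its translation non-invariance, as described in this abstract, can be illustrated with a short sketch (a minimal illustration, not the paper's method: the function names are ours, and kernel centering stands in for one particular feature-space translation rather than the paper's gradient-ascent optimization):

```python
import numpy as np

def kernel_target_alignment(K, y):
    """Empirical alignment A(K, yy^T) = <K, yy^T>_F / (||K||_F ||yy^T||_F)."""
    Y = np.outer(y, y)
    return np.sum(K * Y) / (np.linalg.norm(K, 'fro') * np.linalg.norm(Y, 'fro'))

def center_kernel(K):
    """Translate the data to its mean in feature space (double centering)."""
    n = K.shape[0]
    H = np.eye(n) - np.ones((n, n)) / n
    return H @ K @ H

# Two well-separated classes that sit far from the origin: the raw
# alignment is tiny, while the same data translated to its feature-space
# mean is almost perfectly aligned -- the criterion is not
# translation-invariant.
y = np.array([1, 1, -1, -1])
X = np.array([[5.0], [5.1], [6.0], [6.1]])
K = X @ X.T  # linear kernel
print(kernel_target_alignment(K, y))                 # low (~0.01)
print(kernel_target_alignment(center_kernel(K), y))  # high (~0.99)
```

Centering is only one fixed translation; the paper's contribution is to search over translations by gradient ascent on the alignment itself.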
Multiclass Feature Selection with Kernel Gram-matrix-based criteria
2012
Abstract - Cited by 3 (0 self)
Feature selection has been an important issue over the last decades for determining the most relevant features for a given classification problem. Numerous methods have emerged that incorporate Support Vector Machines into the selection process. Such approaches are powerful but often complex and costly. In this paper, we propose new feature selection methods based on two criteria designed for the optimization of SVMs: Kernel Target Alignment and Kernel Class Separability. We demonstrate how these two measures, when fully expressed, yield efficient and simple methods, easily applicable to multiclass problems, and iteratively computable with minimal memory requirements. An extensive experimental study is conducted on both artificial and real-world data sets to compare the proposed methods to state-of-the-art feature selection algorithms. The results demonstrate the relevance of the proposed methods in terms of both performance and computational cost.
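The idea of using Kernel Target Alignment as a feature selection criterion can be sketched as follows (an illustrative simplification, not the paper's algorithm: we score each feature independently with a binary-label linear kernel, whereas the paper develops full multiclass criteria and iterative selection; all names are ours):

```python
import numpy as np

def kta(K, y):
    """Kernel-target alignment between K and the target matrix y y^T."""
    Y = np.outer(y, y)
    return np.sum(K * Y) / (np.linalg.norm(K, 'fro') * np.linalg.norm(Y, 'fro'))

def rank_features_by_kta(X, y):
    """Rank features by the alignment of their single-feature linear kernel.

    Each feature only needs its own rank-one Gram matrix, so the memory
    footprint stays small -- in the spirit of the criteria above.
    """
    scores = []
    for j in range(X.shape[1]):
        xj = X[:, [j]]
        K = xj @ xj.T  # linear kernel restricted to feature j
        scores.append(kta(K, y))
    return np.argsort(scores)[::-1]  # best-aligned feature first
```

Usage: on data where only one column carries the class signal, that column is ranked first, with no SVM trained at any point.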
COMPARISON OF DIFFERENT STRATEGIES FOR A SVM-BASED AUDIO SEGMENTATION
Abstract - Cited by 2 (0 self)
In this paper, we compare diverse hierarchical and multi-class approaches to the speech/music segmentation task, based on Support Vector Machines combined with median filter post-processing. We show the efficiency of kernel tuning through the novel Kernel Target Alignment criterion. Quantitative results yield an F-measure of 96.9%, which represents an error reduction of about 50% compared to the results gathered by the French ESTER evaluation campaign. We also show the relevance of SVMs with very low feature vector dimension on this task.
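The median filter post-processing mentioned in this abstract smooths frame-wise classifier decisions so that isolated mislabeled frames do not fragment a segment. A generic sketch (our own minimal version, assuming binary frame labels and an odd window; the paper's exact filter settings are not given here):

```python
import numpy as np

def median_smooth(labels, w=5):
    """Median-filter a sequence of per-frame class labels.

    w must be odd; the sequence is edge-padded so the output has the
    same length as the input. A lone flipped frame inside a run of the
    other class is voted away by its neighbors.
    """
    pad = w // 2
    padded = np.pad(labels, pad, mode='edge')
    out = [np.median(padded[i:i + w]) for i in range(len(labels))]
    return np.array(out).astype(labels.dtype)
```

For example, with `w=3`, the isolated spikes in `[0,0,0,1,0,0,0,1,1,1,0,1,1]` are removed while the genuine run of 1s is kept.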