### Table 4: The value of Test Err at the minima of different criteria for fixed C values, for the SVM L2 soft-margin formulation. The values in parentheses are the corresponding logarithms of

### Table 1: Comparison of the two learning algorithms Hieron and SVM with uneven margins for OBIE using three overall performance measures

2006

"... In PAGE 6: ... Table 1: Comparison of the two learning algorithms Hieron and SVM with uneven margins for OBIE using three overall performance measures. Table 1 presents the experimental results comparing the two learning algorithms, SVM and Hieron. We used three measures: the conventional micro-averaged flat F1 (%), and the two ontology-sensitive augmented F1 (%) measures based respectively on the BDM and the LA, BDM F1 and LA F1, which were discussed in Section 3.... ..."

Cited by 8
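The micro-averaged flat F1 mentioned in the snippet above pools true positives, false positives, and false negatives across all categories before computing precision and recall. A minimal sketch of that computation (the function name and the counts layout are my own, not taken from the paper):

```python
def micro_f1(counts):
    """Micro-averaged F1 over per-category (tp, fp, fn) tuples.

    Micro-averaging sums the raw counts across categories first,
    then computes a single precision/recall pair from the totals.
    """
    tp = sum(c[0] for c in counts)
    fp = sum(c[1] for c in counts)
    fn = sum(c[2] for c in counts)
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    return 2 * precision * recall / (precision + recall)
```

Because the counts are pooled, large categories dominate the micro-averaged score; the ontology-sensitive BDM F1 and LA F1 variants instead change how a partial match is credited, not how categories are averaged.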

### Table 15: The numbers of support vectors ("SVs") obtained from the 1-norm soft margin SVM and from the 2-norm soft margin SVM without and with C learned jointly, as proposed in (Lanckriet et al., 2004b). These numbers are averaged over 30 random partitions. The total number of data points in the training set and the C values are also shown. The columns titled "PCT" show the percentage of support vectors over the training set.

### Table 1: Comparison with Soft Margin and Maximal Margin Parzen

2000

"... In PAGE 5: ... We also used bold driving (Bishop 1995) to speed up the convergence. Table 1 lists the errors for various settings of σ and C. For the synthetic data the error is the average true Bayes error over ten data sets, where the optimal parameter was chosen for each data set separately.... ..."

Cited by 13

### Table 4: The initial kernel matrices {K_i}_{i=1}^5 are Gaussian kernels with σ = 0.01, 0.1, 1, 10, 100, respectively. For c we used c = Σ_i trace(K_i) + trace(I_n). {μ_{i,+}}_{i=1}^5 are the average weights of the optimal kernel matrix Σ_i μ_{i,+} K_i for a 2-norm soft margin SVM with μ_i ≥ 0 and tuning of C. The average C value is given as well. The test set accuracies (TSA) of the optimal 2-norm soft margin SVM with tuning of C (SM2,C) and of the best cross-validation soft margin SVM with RBF kernel (best c/v RBF) are reported.

2004

"... In PAGE 34: ... Table 3: See the caption to Table 1 for explanation. ... in Table 4 (averages over 30 randomizations into 80% training and 20% test sets). The test set accuracies obtained for Σ_i μ_{i,+} K_i are competitive with those for the best soft margin SVM with an RBF kernel, tuned using cross-validation.... ..."

Cited by 168
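The kernel combination this entry describes, a weighted sum of Gaussian kernel matrices at several widths with nonnegative weights, can be sketched in a few lines. The function names and the toy data below are my own assumptions, not the authors' code:

```python
import math

def gaussian_kernel_matrix(X, sigma):
    """Gram matrix of the Gaussian (RBF) kernel exp(-||a-b||^2 / (2*sigma^2))."""
    def k(a, b):
        d2 = sum((ai - bi) ** 2 for ai, bi in zip(a, b))
        return math.exp(-d2 / (2.0 * sigma ** 2))
    return [[k(a, b) for b in X] for a in X]

def combine_kernels(kernels, weights):
    """Weighted sum sum_i mu_i * K_i; with mu_i >= 0 the result stays
    a valid (symmetric positive semidefinite) kernel matrix."""
    n = len(kernels[0])
    return [[sum(mu * K[i][j] for mu, K in zip(weights, kernels))
             for j in range(n)] for i in range(n)]

def trace(K):
    return sum(K[i][i] for i in range(len(K)))
```

The trace normalization mentioned in the caption (c = Σ_i trace(K_i) + trace(I_n)) keeps the combined kernel's overall scale fixed, so the learned weights μ_{i,+} are comparable across data sets.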

### Table 1: Comparison of the four algorithms on the Reuters-21578 dataset. In the table, "SVMUM" refers to the SVM with uneven margins and "j-trick" to the SVM with the j-trick.

2003

"... In PAGE 6: ... We applied the SVM with uneven margins and the three other algorithms to the feature vectors. The results are presented in Table 1. First of all, the SVM with uneven margins gave better results than the SVM and the SVM with the j-trick, in particular for the small categories.... ..."

Cited by 5
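The "uneven margins" idea recurring in these entries asymmetrizes the margin requirement: one class must clear the usual margin of 1 while the other only needs to clear a smaller margin τ. A hedged sketch of such a loss (the parameter name `tau` and the sign convention are my own illustration, not necessarily the formulation in the cited papers):

```python
def uneven_hinge(f, y, tau=0.5):
    """Hinge loss with uneven margins for a decision value f = <w, x> + b.

    Positive examples (y = +1) must satisfy f >= 1; negative examples
    (y = -1) only need f <= -tau, a weaker requirement when tau < 1.
    """
    if y == 1:
        return max(0.0, 1.0 - f)
    return max(0.0, tau + f)
```

Relaxing the margin on the large negative class while keeping the full margin on positives is one way such a loss can favor small categories, consistent with the Reuters-21578 observation quoted above.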

### Table 8: Comparison of Hard and Soft-Switching for an SVM Inverter (Federal Urban Driving Schedule: Hard-Switched RA-94 Inverter, SVM)

"... In PAGE 7: ... Table 7: HWFET Drive Cycle Test Results for Modulation Algorithm Change. Table 8: Comparison of Hard and Soft-Switching for an SVM inverter.... ..."
