Results 1 - 10 of 31,930
Table 1. Comparison of classification accuracies for the 20 processes with different classifiers and feature sets. LDA = linear discriminant analysis, QDA = quadratic discriminant analysis, MD = Mahalanobis discrimination, kNN = k-nearest neighbours classifier and LVQ = learning vector quantization
2005
"... In PAGE 6: ... The parameter value 2200 was selected because it seems to yield good classification results for all data sets. 5 Results The classifiers were tested with the eight data sets, and the results for the test data are shown in Table1 . The percentages in the cells indicate the ratios of correctly classified processes; the cells left empty indicate invalid classifier - feature set combinations.... ..."
Cited by 1
Table 2. Comparison of the classification accuracy for the 11 HWH processes with different classifiers and feature sets using features extracted from the voltage and current signals. LDA = linear discriminant analysis, QDA = quadratic discriminant analysis, Mahalanobis = Mahalanobis discrimination, LVQ = learning vector quantization and KNN = k nearest neighbors.
2006
Table 5.11 Analysis of Similarity Between Gaussian and Quadratic Discriminants.
Table 4. Comparison of the classification accuracy for the 20 processes with different classifiers and feature sets. LDA = linear discriminant analysis, QDA = quadratic discriminant analysis, Mahalanobis = Mahalanobis discrimination, LVQ = learning vector quantization and KNN = k nearest neighbours classifier, pc = principal component.
2006
"... In PAGE 49: ... The difference compared to the previous tests was that this time the feature set was extended with five and ten principal components formed from the original features. The results of these tests are displayed in Table4 . The kNN-classifier using three closest neighbours and the ten means of the signal intervals again outperformed the other classifier, with a classification accuracy of 98.... ..."
Table 1. Detection reliability of model-based steganalysis with different JPEG quantisation. For the reliability, the absolute value is decisive. We present signed values in the table where it might be interesting to see between which JPEG qualities the sign change occurs, i.e., where the model fits best.
"... In PAGE 4: ...4 (Table 5 gives an overview of all feature sets used throughout the paper.) For each quality, two classifiers (linear and quadratic discriminant analysis) were trained with 60 images and then applied to another 570 images resulting in the detection reliabilities shown in Table1 . The best result is achieved with the quadratic discriminant analysis, which is almost independent of JPEG quality and embedding method MB1 or MB2.... ..."
Table 2: Training errors (top row) and test errors (bottom row) for different examples. Values are means (standard errors) over 5 simulations. Columns: Dataset, LDA, QDA, Max, Max/thresh, Coupled, Coupled/thresh.
"... In PAGE 16: ... Table2 shows the error rates for the three class problem and a number of other datasets. The classi ers used are: LDA | linear discriminant analysis QDA | quadratic discriminant analysis Max | the rule ~ d from (9) Max/thresh | the rule ~ d with threshold optimization.... ..."
Table 1: Comparison between the four studied methods.
"... In PAGE 5: ... Four methods have been compared: two statistical ones, Linear Discriminant Analysis18 and Quadratic Discriminant Analysis, and two neural ones, Learning Vector Quantisation and MultiLayer Perceptron. Table1 shows that the MLP gives the best results. It is a neural network with 15 inputs (which corresponds to an angular sector superior to the beam aperture of the sensor) with 19 neurones in the hidden layer.... ..."
Table 2: Classification error rates
2003
"... In PAGE 16: ... We compared the performance of MSVM with 10-fold CV with that of the linear discriminant analysis (LDA), the quadratic discriminant analysis (QDA), and the nearest neighbor (NN) method. Table2 is a summary of the comparison results in terms of the classification error rates. For wine and glass, the error rates represent the average of the misclassification rates cross validated over 10-splits.... ..."
Table 11. Discriminant Analysis
"... In PAGE 8: ... Table11 shows that the relative code churn measures have effective discriminant ability (comparable to prior studies done on industrial software [13]). We conclude that relative code churn measures can be used to discriminate between fault and not fault-prone binaries (H4).... ..."