Results 1 - 10 of 31,930

Table 7. Quadratic Discriminant Analysis Accuracy

in Year: 2002/2003
by Prof. Dr. Uwe Schmock, Ernst & Young Zürich; student Anca Antonov

Table 1. Comparison of classification accuracies for the 20 processes with different classifiers and feature sets. LDA = linear discriminant analysis, QDA = quadratic discriminant analysis, MD = Mahalanobis discrimination, kNN = k-nearest neighbours classifier and LVQ = learning vector quantization

in Methods for classifying spot welding processes: A comparative study of performance
by Eija Haapalainen, Perttu Laurinen, Heli Junno, Lauri Tuovinen, Juha Röning 2005
"... In PAGE 6: ... The parameter value 2200 was selected because it seems to yield good classification results for all data sets. 5 Results The classifiers were tested with the eight data sets, and the results for the test data are shown in Table 1. The percentages in the cells indicate the ratios of correctly classified processes; the cells left empty indicate invalid classifier - feature set combinations.... ..."
Cited by 1

Table 2. Comparison of the classification accuracy for the 11 HWH processes with different classifiers and feature sets using features extracted from the voltage and current signals. LDA = linear discriminant analysis, QDA = quadratic discriminant analysis, Mahalanobis = Mahalanobis discrimination, LVQ = learning vector quantization and KNN = k nearest neighbors.

in A TOP-DOWN APPROACH FOR CREATING AND IMPLEMENTING DATA MINING SOLUTIONS
by Perttu Laurinen 2006

Table 5.11 Analysis of Similarity Between Gaussian and Quadratic Discriminants.

in Analysis And Design Of The Multi-Layer Perceptron Using Polynomial Basis Functions
by Mu-Song Chen

Table 4. Comparison of the classification accuracy for the 20 processes with different classifiers and feature sets. LDA = linear discriminant analysis, QDA = quadratic discriminant analysis, Mahalanobis = Mahalanobis discrimination, LVQ = learning vector quantization and KNN = k nearest neighbours classifier, pc = principal component.

in A TOP-DOWN APPROACH FOR CREATING AND IMPLEMENTING DATA MINING SOLUTIONS
by Perttu Laurinen 2006
"... In PAGE 49: ... The difference compared to the previous tests was that this time the feature set was extended with five and ten principal components formed from the original features. The results of these tests are displayed in Table 4. The kNN-classifier using three closest neighbours and the ten means of the signal intervals again outperformed the other classifiers, with a classification accuracy of 98.... ..."
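The procedure this excerpt describes — appending the first principal components to the original feature set, then classifying with a 3-nearest-neighbours rule — could be approximated with scikit-learn as follows. This is a hypothetical sketch: random data stands in for the welding-signal features, and the authors' implementation is not reproduced.

```python
# Hypothetical sketch of the excerpt's procedure: extend the feature set
# with principal components, then classify with a 3-nearest-neighbours
# classifier. Random data stands in for the welding-signal features.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.neighbors import KNeighborsClassifier

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 20))      # 200 samples, 20 original features
y = rng.integers(0, 20, size=200)   # 20 process classes, as in the table

# Append the first ten principal components to the original features.
X_ext = np.hstack([X, PCA(n_components=10).fit_transform(X)])

knn = KNeighborsClassifier(n_neighbors=3).fit(X_ext, y)
print("training accuracy:", knn.score(X_ext, y))
```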

Table 1. Detection reliability of model based steganalysis with different JPEG quantisation. For the reliability the absolute value is decisive. We present signed values in the table where it might be interesting to see between which JPEG qualities the sign change occurs, i.e., where the model fits best.

in Weaknesses of MB2
by Christian Ullerich, Andreas Westfeld
"... In PAGE 4: ...4 (Table 5 gives an overview of all feature sets used throughout the paper.) For each quality, two classifiers (linear and quadratic discriminant analysis) were trained with 60 images and then applied to another 570 images resulting in the detection reliabilities shown in Table 1. The best result is achieved with the quadratic discriminant analysis, which is almost independent of JPEG quality and embedding method MB1 or MB2.... ..."
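The train/test protocol the excerpt describes — fit linear and quadratic discriminant classifiers on 60 images, then score 570 held-out images — might look roughly like this in scikit-learn. This is a hypothetical sketch, not the authors' code: the steganalytic feature extraction from JPEG images is omitted, and random features stand in for it.

```python
# Hypothetical sketch of the protocol in the excerpt: train LDA and QDA on
# 60 samples, evaluate on 570 held-out samples. Random features stand in
# for the steganalytic features extracted from JPEG images in the paper.
import numpy as np
from sklearn.discriminant_analysis import (
    LinearDiscriminantAnalysis,
    QuadraticDiscriminantAnalysis,
)

rng = np.random.default_rng(0)
n_features = 10

def fake_features(n, shift):
    # Placeholder for real steganalytic features (cover vs. stego images).
    return rng.normal(loc=shift, scale=1.0, size=(n, n_features))

# 60 training images (30 cover, 30 stego), 570 test images.
X_train = np.vstack([fake_features(30, 0.0), fake_features(30, 0.5)])
y_train = np.array([0] * 30 + [1] * 30)
X_test = np.vstack([fake_features(285, 0.0), fake_features(285, 0.5)])
y_test = np.array([0] * 285 + [1] * 285)

for clf in (LinearDiscriminantAnalysis(), QuadraticDiscriminantAnalysis()):
    clf.fit(X_train, y_train)
    # The paper reports detection reliability rather than raw accuracy;
    # accuracy is shown here only as a simple stand-in.
    print(type(clf).__name__, f"accuracy = {clf.score(X_test, y_test):.3f}")
```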

Table 2: Training errors (top row) and test errors (bottom row) for different examples. Values are mean (standard errors) over 5 simulations. Columns: Dataset, LDA, QDA, Max, Max/thresh, Coupled, Coupled/thresh.

in Classification by Pairwise Coupling
by Trevor Hastie, Robert Tibshirani
"... In PAGE 16: ... Table 2 shows the error rates for the three class problem and a number of other datasets. The classifiers used are: LDA = linear discriminant analysis, QDA = quadratic discriminant analysis, Max = the rule d̃ from (9), Max/thresh = the rule d̃ with threshold optimization.... ..."

Table 1: Comparison between the four studied methods.

in Off
by Etienne Colle, Christian Barat
"... In PAGE 5: ... Four methods have been compared: two statistical ones, Linear Discriminant Analysis [18] and Quadratic Discriminant Analysis, and two neural ones, Learning Vector Quantisation and MultiLayer Perceptron. Table 1 shows that the MLP gives the best results. It is a neural network with 15 inputs (which corresponds to an angular sector superior to the beam aperture of the sensor) with 19 neurones in the hidden layer.... ..."
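The network described in the excerpt (15 inputs, one hidden layer of 19 neurones) could be sketched with scikit-learn's MLPClassifier. This is an assumption — the original paper presumably used its own implementation — and random data stands in for the sensor readings.

```python
# Sketch of the MLP architecture from the excerpt: 15 inputs and 19 hidden
# neurones. Random data stands in for the 15 angular-sector sensor inputs.
import numpy as np
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 15))     # 15 inputs per sample
y = rng.integers(0, 2, size=100)   # binary target for illustration

mlp = MLPClassifier(hidden_layer_sizes=(19,), max_iter=500, random_state=0)
mlp.fit(X, y)
print("input-to-hidden weight shape:", mlp.coefs_[0].shape)  # (15, 19)
```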

Table 2: Classification error rates

in Multicategory Support Vector Machines, Theory, and Application to the Classification of Microarray Data and Satellite Radiance Data
by Yoonkyung Lee, Yi Lin, Grace Wahba 2003
"... In PAGE 16: ... We compared the performance of MSVM with 10-fold CV with that of the linear discriminant analysis (LDA), the quadratic discriminant analysis (QDA), and the nearest neighbor (NN) method. Table 2 is a summary of the comparison results in terms of the classification error rates. For wine and glass, the error rates represent the average of the misclassification rates cross-validated over 10 splits.... ..."
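The comparison this excerpt describes — cross-validated error rates for LDA, QDA, and a nearest-neighbour classifier — can be sketched with scikit-learn. This is a hypothetical sketch on synthetic data; the paper's wine/glass datasets and its MSVM implementation are not reproduced.

```python
# Sketch of the LDA / QDA / nearest-neighbour comparison described above,
# using 10-fold cross-validation on a synthetic 3-class dataset.
from sklearn.datasets import make_classification
from sklearn.discriminant_analysis import (
    LinearDiscriminantAnalysis,
    QuadraticDiscriminantAnalysis,
)
from sklearn.neighbors import KNeighborsClassifier
from sklearn.model_selection import cross_val_score

X, y = make_classification(
    n_samples=300, n_features=10, n_informative=5,
    n_classes=3, random_state=0,
)

classifiers = {
    "LDA": LinearDiscriminantAnalysis(),
    "QDA": QuadraticDiscriminantAnalysis(),
    "NN": KNeighborsClassifier(n_neighbors=1),
}

for name, clf in classifiers.items():
    # 10-fold cross-validated accuracy; error rate = 1 - accuracy.
    acc = cross_val_score(clf, X, y, cv=10).mean()
    print(f"{name}: error rate = {1 - acc:.3f}")
```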

Table 11. Discriminant Analysis

in Use of Relative Code Churn Measures to Predict System Defect Density
by unknown authors
"... In PAGE 8: ... Table 11 shows that the relative code churn measures have effective discriminant ability (comparable to prior studies done on industrial software [13]). We conclude that relative code churn measures can be used to discriminate between fault-prone and not fault-prone binaries (H4).... ..."
Developed at and hosted by The College of Information Sciences and Technology

© 2007-2019 The Pennsylvania State University