Results 1 - 10 of 17,817
A New Approximation Method of the Quadratic Discriminant Function
Lecture Notes in Computer Science, 2000
"... . For many statistical pattern recognition methods, distributions of sample vectors are assumed to be normal, and the quadratic discriminant function derived from the probability density function of multivariate normal distribution is used for classification. However, the computational cost is O(n ..."
Cited by 3 (0 self)
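For orientation, here is a minimal sketch of the plain quadratic discriminant function this result builds on: the log-density of a class-conditional Gaussian, up to a constant. The data below are toy values of my own, not from the paper; the O(n^2)-per-sample Mahalanobis term is the cost that approximation methods like this one target.

```python
import numpy as np

def qdf_score(x, mean, cov, prior=0.5):
    """Quadratic discriminant function from the multivariate normal
    log-density (up to a constant): higher score -> more likely class.
    The Mahalanobis term costs O(n^2) per sample once cov is inverted,
    which is what faster approximations try to avoid."""
    diff = x - mean
    inv = np.linalg.inv(cov)
    _, logdet = np.linalg.slogdet(cov)
    return -diff @ inv @ diff - logdet + 2.0 * np.log(prior)

# Toy two-class decision (illustrative data, not from the paper):
x = np.array([0.2, -0.1])
s0 = qdf_score(x, np.zeros(2), np.eye(2))
s1 = qdf_score(x, np.ones(2), np.array([[2.0, 0.5], [0.5, 1.0]]))
label = 0 if s0 > s1 else 1
```

Classification picks the class with the larger score; with equal priors this is the maximum-likelihood Gaussian rule.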
Graphical Lasso Quadratic Discriminant Function for Character Recognition
"... Abstract. The quadratic discriminant function (QDF) derived from the multivariate Gaussian distribution is effective for classification in many pattern recognition tasks. In particular, a variant of QDF, called MQDF, has achieved great success and is widely recognized as the state-of-the-art method ..."
Kernel Modified Quadratic Discriminant Function for Facial Expression Recognition
"... Abstract. The Modified Quadratic Discriminant Function (MQDF) was first proposed by Kimura et al. to improve the performance of the Quadratic Discriminant Function, which can be seen as a dot-product method via eigendecomposition of the covariance matrix of each class. Therefore, it is possible to expand MQDF ..."
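A compact sketch of the MQDF idea the two snippets above refer to: keep the leading eigenpairs of each class covariance and replace the minor eigenvalues by a constant. The truncation rank `k` and constant `delta` below are illustrative parameters, not values from the papers.

```python
import numpy as np

def mqdf_score(x, mean, cov, k, delta):
    """Modified QDF in the style the snippet describes: keep the k
    leading eigenpairs of the class covariance and replace the minor
    eigenvalues by a constant delta, which regularizes the estimate
    and cuts the evaluation cost (k and delta are illustrative)."""
    evals, evecs = np.linalg.eigh(cov)
    order = np.argsort(evals)[::-1]          # descending eigenvalues
    lam = evals[order][:k]
    V = evecs[:, order][:, :k]
    d = x - mean
    proj = V.T @ d                           # coordinates on leading axes
    major = np.sum(proj ** 2 / lam)
    minor = (d @ d - proj @ proj) / delta    # residual minor-subspace term
    logdet = np.sum(np.log(lam)) + (len(d) - k) * np.log(delta)
    return -(major + minor + logdet)         # higher -> better fit

score = mqdf_score(np.array([1.0, 0.0]), np.zeros(2), np.diag([2.0, 1.0]),
                   k=1, delta=1.0)
```

Only k projections per class are needed at test time, instead of a full n-by-n quadratic form.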
EFFECTS OF SAMPLE SIZE RATIO ON THE PERFORMANCE OF THE QUADRATIC DISCRIMINANT FUNCTION
"... This study investigated the performance of the heteroscedastic discriminant function under the non-optimal condition of unbalanced group representation in the populations. The asymptotic performance of the classification function ..."
SAMPLE-SEPARATION-MARGIN BASED MINIMUM CLASSIFICATION ERROR TRAINING OF PATTERN CLASSIFIERS WITH QUADRATIC DISCRIMINANT FUNCTIONS
"... In this paper, we present a new approach to minimum classification error (MCE) training of pattern classifiers with quadratic discriminant functions. First, a so-called sample separation margin (SSM) is defined for each training sample and then used to define the misclassification measure in MCE for ..."
The use of the area under the ROC curve in the evaluation of machine learning algorithms
PATTERN RECOGNITION, 1997
"... In this paper we investigate the use of the area under the receiver operating characteristic (ROC) curve (AUC) as a performance measure for machine learning algorithms. As a case study we evaluate six machine learning algorithms (C4.5, Multiscale Classifier, Perceptron, Multi-layer Perceptron, k-Nearest Neighbours, and a Quadratic Discriminant Function) on six "real world" medical diagnostics data sets. We compare and discuss the use of AUC to the more conventional overall accuracy and find that AUC exhibits a number of desirable properties when compared to overall accuracy: increased ..."
Cited by 685 (3 self)
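The AUC this paper studies can be computed directly as the Mann-Whitney rank statistic, without tracing the ROC curve; a small self-contained sketch (not the paper's own code):

```python
def auc(scores, labels):
    """Area under the ROC curve via the Mann-Whitney rank statistic:
    the probability that a randomly chosen positive is ranked above
    a randomly chosen negative (ties count half)."""
    pos = [s for s, y in zip(scores, labels) if y == 1]
    neg = [s for s, y in zip(scores, labels) if y == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

# A perfect ranking gives 1.0, a reversed one 0.0, a random one about 0.5.
perfect = auc([0.9, 0.8, 0.3, 0.1], [1, 1, 0, 0])
```

Unlike overall accuracy, this quantity depends only on the ranking of scores, not on any decision threshold.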
Fisher Discriminant Analysis With Kernels
1999
"... A non-linear classification technique based on Fisher's discriminant is proposed. The main ingredient is the kernel trick which allows the efficient computation of Fisher discriminant in feature space. The linear classification in feature space corresponds to a (powerful) non-linear decision f ..."
Cited by 503 (18 self)
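A minimal sketch of the kernel trick for Fisher's discriminant as the snippet describes it: every quantity is expressed through Gram-matrix entries, so the discriminant lives in feature space without ever computing feature vectors. The RBF kernel and ridge term here are illustrative choices, not the paper's exact setup.

```python
import numpy as np

def rbf(A, B, gamma=1.0):
    """Gaussian RBF Gram matrix between row-vector sets A and B."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def kernel_fda(X1, X2, kernel, ridge=1e-3):
    """Two-class kernel Fisher discriminant: solve for expansion
    coefficients alpha in the span of the training data, using only
    Gram-matrix entries (the kernel trick). Returns a projection
    function onto the discriminant direction."""
    X = np.vstack([X1, X2])
    n, n1, n2 = len(X), len(X1), len(X2)
    K = kernel(X, X)
    M1 = K[:, :n1].mean(axis=1)              # mean kernel column, class 1
    M2 = K[:, n1:].mean(axis=1)              # mean kernel column, class 2
    N = np.zeros((n, n))                     # within-class scatter
    for K_c, n_c in ((K[:, :n1], n1), (K[:, n1:], n2)):
        center = np.eye(n_c) - np.full((n_c, n_c), 1.0 / n_c)
        N += K_c @ center @ K_c.T
    alpha = np.linalg.solve(N + ridge * np.eye(n), M1 - M2)
    return lambda Z: kernel(Z, X) @ alpha

rng = np.random.default_rng(1)
X1 = rng.normal(0.0, 0.3, size=(20, 2))     # class 1 cluster
X2 = rng.normal(3.0, 0.3, size=(20, 2))     # class 2 cluster
project = kernel_fda(X1, X2, rbf)
```

The projected class means are pushed apart relative to the within-class spread, giving a non-linear decision boundary in the input space.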
Exploiting Generative Models in Discriminative Classifiers
Advances in Neural Information Processing Systems 11, 1998
"... Generative probability models such as hidden Markov models provide a principled way of treating missing information and dealing with variable-length sequences. On the other hand, discriminative methods such as support vector machines enable us to construct flexible decision boundaries and often result in classification performance superior to that of the model-based approaches. An ideal classifier should combine these two complementary approaches. In this paper, we develop a natural way of achieving this combination by deriving kernel functions for use in discriminative methods such as support ..."
Cited by 551 (9 self)
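The combination this abstract describes is commonly realized as a Fisher kernel: use the gradient of a generative model's log-likelihood with respect to its parameters as a feature map for a discriminative classifier. A toy sketch for a univariate Gaussian model (my own example, not the paper's code):

```python
import numpy as np

def fisher_score(x, mu, var):
    """Fisher-score feature map for a toy univariate Gaussian
    generative model: the gradient of log N(x; mu, var) w.r.t.
    the parameters (mu, var). A kernel for a discriminative
    classifier is then the inner product of these score vectors
    (optionally whitened by the Fisher information)."""
    d_mu = (x - mu) / var
    d_var = ((x - mu) ** 2 - var) / (2.0 * var ** 2)
    return np.array([d_mu, d_var])

u1 = fisher_score(1.0, 0.0, 1.0)
u2 = fisher_score(-1.0, 0.0, 1.0)
k12 = u1 @ u2        # kernel value between the two points
```

The same recipe applies to richer generative models such as HMMs, where the score vector is the gradient over all model parameters.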
Object Detection with Discriminatively Trained Part Based Models
"... We describe an object detection system based on mixtures of multiscale deformable part models. Our system is able to represent highly variable object classes and achieves state-of-the-art results in the PASCAL object detection challenges. While deformable part models have become quite popular, their value had not been demonstrated on difficult benchmarks such as the PASCAL datasets. Our system relies on new methods for discriminative training with partially labeled data. We combine a margin-sensitive approach for data-mining hard negative examples with a formalism we call latent SVM. A latent SVM ..."
Cited by 1422 (49 self)
Regularized discriminant analysis
J. Amer. Statist. Assoc., 1989
"... Linear and quadratic discriminant analysis are considered in the small-sample, high-dimensional setting. Alternatives to the usual maximum likelihood (plug-in) estimates for the covariance matrices are proposed. These alternatives are characterized by two parameters, the values of which are customized ..."
Cited by 468 (2 self)
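A sketch of the two-parameter covariance regularization this abstract describes, in the spirit of RDA: one parameter blends each class covariance toward the pooled estimate (moving from quadratic toward linear discriminant analysis), the other shrinks the result toward a scaled identity. The function name and exact form below are my assumption of the standard formulation, not code from the paper.

```python
import numpy as np

def rda_covariance(cov_k, cov_pooled, lam, gamma):
    """Two-parameter regularized covariance in the spirit of RDA:
    lam blends the class covariance toward the pooled one, and gamma
    shrinks the result toward a scaled identity so the estimate
    stays well-conditioned when samples are scarce relative to the
    dimension."""
    blended = (1.0 - lam) * cov_k + lam * cov_pooled
    p = blended.shape[0]
    return (1.0 - gamma) * blended + gamma * (np.trace(blended) / p) * np.eye(p)

# A rank-deficient class covariance becomes invertible after shrinkage:
cov_k = np.array([[4.0, 0.0], [0.0, 0.0]])
reg = rda_covariance(cov_k, np.eye(2), lam=0.0, gamma=0.5)
```

Setting lam = 1, gamma = 0 recovers the pooled (LDA) covariance; lam = 0, gamma = 1 gives a spherical estimate.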