Results 1 – 7 of 7
Top 10 algorithms in data mining
, 2007
"... Abstract This paper presents the top 10 data mining algorithms identified by the IEEE International Conference on Data Mining (ICDM) in December 2006: C4.5, kMeans, SVM, Apriori, EM, PageRank, AdaBoost, kNN, Naive Bayes, and CART. These top 10 algorithms are among the most influential data mining a ..."
Abstract

Cited by 113 (2 self)
Abstract This paper presents the top 10 data mining algorithms identified by the IEEE International Conference on Data Mining (ICDM) in December 2006: C4.5, k-Means, SVM, Apriori, EM, PageRank, AdaBoost, kNN, Naive Bayes, and CART. These top 10 algorithms are among the most influential data mining algorithms in the research community. With each algorithm, we provide a description of the algorithm, discuss the impact of the algorithm, and review current and further research on the algorithm. These 10 algorithms cover classification, ...
Handling possibilistic labels in pattern classification using Evidential reasoning
, 2000
"... A category of learning problems is considered, in which the class membership of training patterns is assessed by an expert and encoded in the form of a possibility distribution. Each example i thus consists in a feature vector x i and a possibilistic label (u i 1 ; : : : ; u i c ), where u k de ..."
Abstract

Cited by 6 (2 self)
A category of learning problems is considered, in which the class membership of training patterns is assessed by an expert and encoded in the form of a possibility distribution. Each example i thus consists of a feature vector x^i and a possibilistic label (u^i_1, …, u^i_c), where u^i_k denotes the possibility of that example belonging to class k. This problem is tackled in the framework of Evidence Theory. The evidential distance-based classifier previously introduced by one of the authors is extended to handle possibilistic labeling of training data. Two approaches are proposed, based either on the transformation of each possibility distribution into a consonant belief function, or on the use of generalized belief structures with fuzzy focal elements. In each case, a belief function modeling the expert's beliefs concerning the class membership of each new pattern is obtained. This information may then be either interpreted by a human operator to support decision-making, or ...
Neuro-Fuzzy Systems for Intelligent Scientific Computation
 In Proc. ANNIE '95
, 1995
"... Intelligence has been envisioned as a key component of future problem solving environments for scientific computing. This paper describes a computationally intelligent approach to address a major problem in scientific computation, i.e., the efficient solution of partial differential equations (PDEs) ..."
Abstract

Cited by 6 (5 self)
Intelligence has been envisioned as a key component of future problem-solving environments for scientific computing. This paper describes a computationally intelligent approach to a major problem in scientific computation, i.e., the efficient solution of partial differential equations (PDEs). This approach is implemented in PYTHIA, a system that supports smart parallel PDE solvers. PYTHIA provides advice on what method and parameters to use for the solution of a specific PDE problem. It achieves this by comparing the characteristics of the given PDE with those of previously observed classes of PDEs. An important step in the reasoning mechanism of PYTHIA is the categorization of PDE problems into classes based on their characteristics. Exemplar-based reasoning systems and backpropagation-style neural networks have previously been used to this end. In this paper, we describe the use of fuzzy min-max neural networks to realize the same objective. This method converges faster, is mor...
Fuzzy Set Theoretic Adjustment to Training Set Class Labels Using Robust Location Measures
"... Fuzzy class label adjustment is a classification preprocessing strategy that compensates for the possible imprecision of class labels. Using training vectors, robust measures of location and dispersion are computed for each class center. Based on distances from these centers, fuzzy sets are construc ..."
Abstract

Cited by 4 (0 self)
Fuzzy class label adjustment is a classification preprocessing strategy that compensates for the possible imprecision of class labels. Using training vectors, robust measures of location and dispersion are computed for each class center. Based on distances from these centers, fuzzy sets are constructed that determine the degree to which each input vector belongs to each class. These membership values are then used to adjust class labels for the training vectors. This strategy is evaluated using a multilayer perceptron and two different robust location measures for the discrimination of meteorological storm events and is shown to improve the performance of the underlying classifier.
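The adjustment strategy described above can be sketched in a few lines. This is a minimal illustration under stated assumptions: the coordinate-wise median is taken as the robust location measure, and an inverse-distance formula is used for the fuzzy memberships — both are illustrative choices, not necessarily the paper's exact method.

```python
import numpy as np

def fuzzy_adjust_labels(X, y, n_classes):
    """Return soft labels (n_samples, n_classes) from robust class centers.

    Assumption: coordinate-wise median as the robust location measure and
    normalized inverse-distance memberships (illustrative choices).
    """
    # robust center of each class, computed from its training vectors
    centers = np.array([np.median(X[y == k], axis=0) for k in range(n_classes)])
    # distance of every training vector to every class center
    d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2)
    # inverse-distance memberships, normalized so each row sums to 1
    inv = 1.0 / (d + 1e-12)
    return inv / inv.sum(axis=1, keepdims=True)

X = np.array([[0.0, 0.0], [0.1, 0.0], [1.0, 1.0], [0.9, 1.1]])
y = np.array([0, 0, 1, 1])
U = fuzzy_adjust_labels(X, y, 2)
# rows of U are the adjusted soft labels used to retrain the classifier
```

The soft labels U would then replace the crisp targets when training the multilayer perceptron, so that vectors far from their nominal class center contribute with reduced certainty.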
A Fuzzy Generalized Nearest Prototype Classifier
, 1997
"... We propose a Fuzzy Generalized Nearest Prototype Classifier (FGNPC). The classification decision is crisp and is based on aggregation of similarities between the unlabeled object x and n p prototypes fv i g with "soft" labels. FGNPC contains as special cases the 1nearest neighbor rule, t ..."
Abstract

Cited by 3 (2 self)
We propose a Fuzzy Generalized Nearest Prototype Classifier (FGNPC). The classification decision is crisp and is based on aggregation of similarities between the unlabeled object x and n_p prototypes {v_i} with "soft" labels. FGNPC contains as special cases the 1-nearest neighbor rule, the minimum-distance classifier, and some types of radial-basis function networks and fuzzy if-then systems. An experimental illustration is also presented. 1 Introduction The nearest prototype classifier (NPC) is one of the simplest and most intuitively pleasing pattern classification paradigms [3, 6]. Let L = {1, 2, …, c} be a set of class labels and V = {v_1, …, v_{n_p}} be a set of prototypes, v_i ∈ R^d, 1 ≤ i ≤ n_p. We call any function D : R^d → L a crisp classifier. The classical NPC [6] assumes that n_p ≥ c, and the prototypes V are crisply labeled to the classes, i.e., a set of crisp class labels I_V = {l_1, …, l_{n_p}}, l_i ∈ L, is associated with V. A vector x ∈ R^d ...
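The classical crisp NPC that the abstract generalizes can be sketched directly from its definition: assign x the crisp label of its nearest prototype. This is a minimal illustration of the baseline NPC only, not of the FGNPC itself; variable names mirror the abstract's notation (V, I_V).

```python
import numpy as np

def npc_predict(x, prototypes, proto_labels):
    """Classical crisp NPC: return the label of the prototype nearest to x."""
    d = np.linalg.norm(prototypes - x, axis=1)
    return proto_labels[int(np.argmin(d))]

# two prototypes, one per class (n_p = c = 2), crisply labeled
V = np.array([[0.0, 0.0], [1.0, 1.0]])
I_V = [1, 2]
label = npc_predict(np.array([0.2, 0.1]), V, I_V)
```

The FGNPC replaces the crisp labels I_V with soft labels and aggregates similarities to all prototypes before the final crisp decision, which is how it subsumes the 1-NN rule and RBF-style networks as special cases.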
Design of Classifier for Detection of Diabetes using Neural Network and Fuzzy k-Nearest Neighbor Algorithm
"... Diabetes Mellitus is one of the growing vitally fatal diseases worldwide. A design of classifier for the detection of Diabetes Mellitus with optimal cost and precise performance is the need of the age. The current project implementation looks further to train self organizing neural networks and app ..."
Abstract

Cited by 2 (0 self)
Diabetes Mellitus is one of the fastest-growing fatal diseases worldwide. A classifier that detects Diabetes Mellitus at optimal cost and with precise performance is the need of the age. The current project implementation looks to train self-organizing neural networks and apply fuzzy logic to effectively classify a patient as diabetic. Neural networks are chosen for their dynamic nature of learning and future application of knowledge. Fuzzy logic allows partial membership and a rule base that maps directly between human thinking and machine results. The proposed method uses a neural network implementation of the fuzzy k-nearest neighbor algorithm to design the classifier. The system is to run on small mobile devices to give the user mobility, while the processing is done on a server machine.
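The fuzzy k-nearest neighbor rule underlying this design can be sketched as follows, assuming the standard Keller et al. formulation (inverse-distance weighting with fuzzifier m); the paper's neural-network implementation and mobile/server split are not reproduced here.

```python
import numpy as np

def fuzzy_knn(x, X_train, U_train, k=3, m=2.0):
    """Fuzzy k-NN: combine the soft labels of the k nearest training
    patterns, weighted by inverse distance raised to 2/(m-1)."""
    d = np.linalg.norm(X_train - x, axis=1)
    idx = np.argsort(d)[:k]                       # k nearest neighbors
    w = 1.0 / (d[idx] ** (2.0 / (m - 1.0)) + 1e-12)
    return (U_train[idx] * w[:, None]).sum(axis=0) / w.sum()

# toy data: crisp one-hot memberships for two classes
X_train = np.array([[0.0, 0.0], [0.1, 0.0], [1.0, 1.0], [0.9, 1.1]])
U_train = np.array([[1.0, 0.0], [1.0, 0.0], [0.0, 1.0], [0.0, 1.0]])
u = fuzzy_knn(np.array([0.05, 0.05]), X_train, U_train, k=3)
# u holds the query's fuzzy membership in each class
```

The returned membership vector supports a graded diagnosis rather than a hard yes/no, which is the property the abstract attributes to the fuzzy rule base.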
Article info / Article history:
, 2012
"... research field for many researchers in pattern recognition and machine learning [4,100] and the study and development ngs to sup training se reference. Here, a pattern x follows the usual definition x x1; x2;...; xd;xf g, where d is the number of attributes t scribe the data and x is its assigned c ..."
Abstract
research field for many researchers in pattern recognition and machine learning [4,100], and the study and development … training set … reference. Here, a pattern x follows the usual definition x = {x_1, x_2, …, x_d, ω}, where d is the number of attributes that describe the data and ω is its assigned class. The general definition of the NN rule in supervised classification, the k nearest neighbors classifier (kNN), considers the use of the most similar (nearest) k patterns in TR to derive the class of a test pattern. More formally, let x_i be a training ...
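The kNN rule as defined in this passage can be sketched directly: find the k training patterns in TR nearest to the test pattern and take a majority vote over their classes. Variable names (TR_X, TR_y) are illustrative, not from the paper.

```python
import numpy as np
from collections import Counter

def knn_classify(x, TR_X, TR_y, k=3):
    """kNN rule: majority class among the k training patterns nearest to x."""
    d = np.linalg.norm(TR_X - x, axis=1)
    neighbors = [TR_y[i] for i in np.argsort(d)[:k]]
    return Counter(neighbors).most_common(1)[0][0]

TR_X = np.array([[0.0, 0.0], [0.1, 0.0], [1.0, 1.0], [0.9, 1.1]])
TR_y = [0, 0, 1, 1]
pred = knn_classify(np.array([0.05, 0.05]), TR_X, TR_y, k=3)
```

Prototype selection and generation methods, the topic this survey-style passage introduces, aim to shrink TR while preserving the accuracy of this rule.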