Results 11–20 of 435
Pattern analysis for machine olfaction: a review
 IEEE Sensors Journal
, 2002
Abstract

Cited by 59 (10 self)
Abstract—Pattern analysis constitutes a critical building block in the development of gas sensor array instruments capable of detecting, identifying, and measuring volatile compounds, a technology that has been proposed as an artificial substitute for the human olfactory system. The successful design of a pattern analysis system for machine olfaction requires careful consideration of the various issues involved in processing multivariate data: signal preprocessing, feature extraction, feature selection, classification, regression, clustering, and validation. A considerable number of methods from statistical pattern recognition, neural networks, chemometrics, machine learning, and biological cybernetics have been used to process electronic nose data. The objective of this review paper is to provide a summary of and guidelines for the most widely used pattern analysis techniques, as well as to identify research directions at the frontier of sensor-based machine olfaction. Index Terms—Classification, clustering, dimensionality reduction, electronic nose, multicomponent analysis, pattern analysis, preprocessing, validation.
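The processing chain the review enumerates (signal preprocessing, feature extraction, classification) can be sketched in a few lines. The sensor readings, the baseline-subtraction step, and the nearest-centroid rule below are illustrative assumptions for the sketch, not the paper's specific method.

```python
import math

# Illustrative three-stage e-nose pipeline: preprocessing (baseline
# subtraction + unit-norm scaling), the normalized response as the
# feature vector, and nearest-centroid classification. Sensor values
# and class centroids are made-up assumptions.

def preprocess(raw, baseline):
    # Baseline subtraction followed by unit-norm scaling.
    x = [r - b for r, b in zip(raw, baseline)]
    norm = math.sqrt(sum(v * v for v in x)) or 1.0
    return [v / norm for v in x]

def nearest_centroid(x, centroids):
    # Classification stage: assign to the closest class centroid.
    def d2(a, b):
        return sum((u - v) ** 2 for u, v in zip(a, b))
    return min(centroids, key=lambda label: d2(x, centroids[label]))

centroids = {"ethanol": [0.8, 0.6, 0.0], "acetone": [0.0, 0.6, 0.8]}
sample = preprocess([9.0, 7.0, 1.0], [1.0, 1.0, 1.0])
print(nearest_centroid(sample, centroids))  # -> ethanol
```

Normalization makes the decision depend on the response pattern across the array rather than on absolute concentration, which is the usual motivation for this preprocessing choice.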
Face recognition with radial basis function (RBF) neural networks
 IEEE Transactions on Neural Networks
, 2002
Abstract

Cited by 51 (2 self)
Abstract—This paper presents a general and efficient design approach that uses a radial basis function (RBF) neural classifier to cope with small training sets of high dimension, a problem frequently encountered in face recognition. In order to avoid overfitting and reduce the computational burden, face features are first extracted by principal component analysis (PCA). The resulting features are then further processed by Fisher's linear discriminant (FLD) to obtain lower-dimensional discriminant patterns. A novel paradigm is proposed whereby data information is encapsulated in determining the structure and initial parameters of the RBF neural classifier before learning takes place. A hybrid learning algorithm is used to train the RBF neural networks so that the dimension of the search space in the gradient paradigm is drastically reduced. Simulation results on the ORL database show that the system achieves excellent performance both in terms of classification error rates and learning efficiency. Index Terms—Face recognition, Fisher's linear discriminant, ORL database, principal component analysis, radial basis function (RBF) neural networks, small training sets of high dimension.
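The first stage of the pipeline described above, PCA for dimensionality reduction, can be sketched as follows. The closed-form 2-D principal direction (via the orientation angle of the covariance matrix) and the sample data are illustrative simplifications; real face features are high-dimensional and need a full eigendecomposition.

```python
import math

# Sketch of PCA projection in 2-D: find the leading principal
# component of the sample covariance matrix via its orientation
# angle, then project the data onto it. The 2-D closed form is an
# illustrative simplification of the general eigen-analysis.

def leading_pc(points):
    xs = [p[0] for p in points]
    ys = [p[1] for p in points]
    n = len(points)
    mx, my = sum(xs) / n, sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs) / n
    syy = sum((y - my) ** 2 for y in ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / n
    # Major-axis angle of the 2x2 covariance matrix.
    theta = 0.5 * math.atan2(2.0 * sxy, sxx - syy)
    return math.cos(theta), math.sin(theta)

def project(points, direction):
    dx, dy = direction
    return [p[0] * dx + p[1] * dy for p in points]

v = leading_pc([(0.0, 0.0), (1.0, 1.0), (2.0, 2.0), (3.0, 3.0)])
print(v)  # data lies on y = x, so the direction is the diagonal
```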
Sparse modelling using orthogonal forward regression with press statistic and regularization
 IEEE Transactions on Systems, Man, and Cybernetics, Part B
, 2004
Abstract

Cited by 49 (23 self)
The paper introduces an efficient construction algorithm for obtaining sparse linear-in-the-weights regression models based on an approach of directly optimizing model generalization capability. This is achieved by utilizing the delete-1 cross-validation concept and the associated leave-one-out test error, also known as the predicted residual sums of squares (PRESS) statistic, without resorting to any other validation data set for model evaluation during model construction. Computational efficiency is ensured by using an orthogonal forward regression, but the algorithm incrementally minimizes the PRESS statistic instead of the usual sum of squared training errors. A local regularization method can naturally be incorporated into the model selection procedure to further enforce model sparsity. The proposed algorithm is fully automatic, and the user is not required to specify any criterion to terminate the model construction procedure. Comparisons with some existing state-of-the-art modeling methods are given, and several examples are included to demonstrate the ability of the proposed algorithm to effectively construct sparse models that generalize well.
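The leave-one-out PRESS statistic minimized above has a closed form that avoids refitting the model n times: each deleted residual equals the ordinary residual divided by one minus the point's leverage. A minimal sketch for simple linear regression follows (an assumption for illustration; the paper works with general linear-in-the-weights models):

```python
# Sketch of the PRESS statistic: leave-one-out residuals obtained
# in closed form from ordinary residuals and leverages, here for
# simple (one-regressor) linear regression with intercept.

def press_statistic(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    b = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / sxx
    a = my - b * mx
    press = 0.0
    for x, y in zip(xs, ys):
        e = y - (a + b * x)                 # ordinary residual
        h = 1.0 / n + (x - mx) ** 2 / sxx   # leverage (hat-matrix diagonal)
        press += (e / (1.0 - h)) ** 2       # squared deleted residual
    return press

print(press_statistic([0.0, 1.0, 2.0, 3.0], [1.0, 3.0, 5.0, 7.0]))  # 0 for a perfect fit
```

The same hat-matrix identity extends to any linear-in-the-weights model, which is what makes incremental PRESS minimization affordable inside a forward-selection loop.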
Combined Genetic Algorithm Optimization and Regularized Orthogonal Least Squares Learning for Radial Basis Function Networks
, 1999
Abstract

Cited by 40 (14 self)
The paper presents a two-level learning method for radial basis function (RBF) networks. A regularized orthogonal least squares (ROLS) algorithm is employed at the lower level to construct RBF networks, while the two key learning parameters, the regularization parameter and the RBF width, are optimized by a genetic algorithm (GA) at the upper level. Nonlinear time series modeling and prediction is used as an example to demonstrate the effectiveness of this hierarchical learning approach.
Local Regularization Assisted Orthogonal Least Squares Regression
 IEEE Transactions on Neural Networks, submitted
, 2001
Abstract

Cited by 36 (10 self)
A locally regularized orthogonal least squares (LROLS) algorithm is proposed for constructing parsimonious or sparse regression models that generalize well. By associating each orthogonal weight in the regression model with an individual regularization parameter, the ability of orthogonal least squares (OLS) model selection to produce a very sparse model with good generalization performance is greatly enhanced. Furthermore, with the assistance of local regularization, it becomes much clearer when to terminate the model selection procedure. The LROLS algorithm has computational advantages over the recently introduced relevance vector machine (RVM) method. Keywords: Orthogonal least squares algorithm, regularization, regression, support vector machines, relevance vector machines.
A Neural Network Primer
, 1994
Abstract

Cited by 36 (8 self)
Neural networks are composed of basic units somewhat analogous to neurons. These units are linked to each other by connections whose strength is modifiable as a result of a learning process or algorithm. Each of these units integrates independently (in parallel) the information provided by its synapses in order to evaluate its state of activation. The unit response is then a linear or nonlinear function of its activation. Linear algebra concepts are used, in general, to analyze linear units, with eigenvectors and eigenvalues being the core concepts involved. This analysis makes clear the strong similarity between linear neural networks and the general linear model developed by statisticians. The linear models presented here are the perceptron and the linear associator. The behavior of nonlinear networks can be described within the framework of optimization and approximation techniques with dynamical systems (e.g., those used to model spin glasses). One of the main notio...
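The unit behavior described above can be sketched directly: integrate the synaptic inputs into an activation (a weighted sum), then apply a response function. The logistic nonlinearity below is an illustrative choice; a linear unit simply uses the identity.

```python
import math

# Minimal sketch of a single unit: the activation is the weighted
# sum of its inputs, and the response is a (possibly nonlinear)
# function of that activation. The logistic default is illustrative.

def unit_response(inputs, weights, f=lambda a: 1.0 / (1.0 + math.exp(-a))):
    activation = sum(w * x for w, x in zip(weights, inputs))
    return f(activation)

# Nonlinear (logistic) unit and linear unit on the same inputs.
print(unit_response([1.0, 0.5], [2.0, -2.0]))                 # logistic of 1.0
print(unit_response([1.0, 0.5], [2.0, -2.0], f=lambda a: a))  # 1.0
```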
Median Radial Basis Functions Neural Network
 IEEE Trans. on Neural Networks
, 1996
Abstract

Cited by 34 (18 self)
A radial basis function (RBF) network consists of a two-layer neural network in which each hidden unit implements a kernel function. Each kernel is associated with an activation region of the input space, and its output is fed to an output unit. In order to find the parameters of a neural network that embeds this structure, we consider two different statistical approaches. The first approach uses classical estimation in the learning stage and is based on the learning vector quantization algorithm and its second-order statistics extension. After presenting this approach, we introduce the Median Radial Basis Function (MRBF) algorithm, based on robust estimation of the hidden unit parameters. The proposed algorithm employs the marginal median for kernel location estimation and the median of the absolute deviations for scale parameter estimation. A histogram-based fast implementation is provided for the MRBF algorithm. The theoretical performance of the two training al...
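The two robust estimators named in the abstract, the marginal median for kernel location and the median of absolute deviations (MAD) for scale, can be sketched as follows; the sample points (with a deliberate outlier in the second coordinate) are illustrative.

```python
# Robust per-kernel estimators: the marginal median as the kernel
# center and the per-dimension median absolute deviation (MAD) as
# the scale parameter. The sample points are made up to show that
# the estimates resist an outlier.

def median(vals):
    s = sorted(vals)
    n = len(s)
    mid = n // 2
    return s[mid] if n % 2 else 0.5 * (s[mid - 1] + s[mid])

def mrbf_kernel_params(samples):
    # samples: points assigned to one hidden unit's activation region.
    dims = list(zip(*samples))
    center = [median(d) for d in dims]                # marginal median
    scale = [median([abs(v - c) for v in d])          # MAD per dimension
             for d, c in zip(dims, center)]
    return center, scale

center, scale = mrbf_kernel_params([(0.0, 1.0), (1.0, 3.0), (2.0, 100.0)])
print(center, scale)  # the outlier 100.0 barely moves the estimates
```

A sample mean and standard deviation on the same points would be pulled far toward the outlier, which is the motivation for the robust choice.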
Representation of functional data in neural networks
 Neurocomputing 64 (2005) 183–210
, 2005
Abstract

Cited by 33 (14 self)
Functional data analysis (FDA) is an extension of traditional data analysis to functional data, for example spectra, temporal series, spatio-temporal images, gesture recognition data, etc. Functional data are rarely known in practice; usually a regular or irregular sampling is known. For this reason, some processing is needed in order to benefit from the smooth character of functional data in the analysis methods. This paper shows how to extend radial basis function network (RBFN) and multilayer perceptron (MLP) models to functional data inputs, in particular when the latter are known through lists of input/output pairs. Various possibilities for functional processing are discussed, including the projection on smooth bases, functional principal component analysis, functional centering and reduction, and the use of differential operators. It is shown how to incorporate these functional
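One of the preprocessing options listed, functional centering and reduction, can be sketched for regularly sampled curves: subtract the pointwise mean curve and divide by the pointwise standard deviation across the sample. The sampled curves below are made up for illustration.

```python
# Sketch of functional centering and reduction for a sample of
# regularly sampled curves: each sampling point (column) is centered
# by the mean curve and scaled by the pointwise standard deviation.

def center_and_reduce(curves):
    n = len(curves)
    m = len(curves[0])
    mean = [sum(c[j] for c in curves) / n for j in range(m)]
    std = [max((sum((c[j] - mean[j]) ** 2 for c in curves) / n) ** 0.5, 1e-12)
           for j in range(m)]   # floor avoids division by zero
    return [[(c[j] - mean[j]) / std[j] for j in range(m)] for c in curves]

curves = [[0.0, 1.0, 2.0], [2.0, 3.0, 4.0]]
print(center_and_reduce(curves))  # every column now has zero mean
```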
An Efficient Method to Construct a Radial Basis Function Neural Network Classifier
, 1997
Abstract

Cited by 31 (1 self)
A radial basis function neural network (RBFN) has the power of universal function approximation, but it is usually not straightforward to construct an RBFN to solve a given problem. This paper describes a method to construct an RBFN classifier efficiently and effectively. The method determines the middle-layer neurons by a fast clustering algorithm and computes the optimal weights between the middle and output layers statistically. We applied the proposed method to construct an RBFN classifier for unconstrained handwritten digit recognition. The experiment showed that the method could construct an RBFN classifier quickly and that the classifier performed better than the best result previously reported. Keywords: Radial Basis Function, Linear Discriminant Function, Classification, APCIII, Clustering, GRBF, LMS, Handwritten Digit Recognition.
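The two-stage construction described above, cluster-derived centers followed by a statistical (least squares) fit of the output weights, can be sketched for a one-dimensional toy problem. The Gaussian kernel width, the hand-fixed centers standing in for clustering output, and the 2x2 normal-equation solve are all assumptions of the sketch.

```python
import math

# Two-stage RBFN construction sketch: middle-layer centers assumed
# given by a clustering step (here fixed by hand), output weights
# fitted by least squares via the 2x2 normal equations.

def rbf(x, c, width=1.0):
    return math.exp(-((x - c) ** 2) / (2.0 * width ** 2))

def fit_output_weights(xs, ys, centers):
    # Solve (Phi^T Phi) w = Phi^T y for exactly two centers.
    phi = [[rbf(x, c) for c in centers] for x in xs]
    a = sum(r[0] * r[0] for r in phi)
    b = sum(r[0] * r[1] for r in phi)
    d = sum(r[1] * r[1] for r in phi)
    p = sum(r[0] * y for r, y in zip(phi, ys))
    q = sum(r[1] * y for r, y in zip(phi, ys))
    det = a * d - b * b
    return [(d * p - b * q) / det, (a * q - b * p) / det]

centers = [0.0, 5.0]
xs = [0.0, 1.0, 4.0, 5.0]
ys = [rbf(x, 0.0) + 2.0 * rbf(x, 5.0) for x in xs]  # target lies in model span
print(fit_output_weights(xs, ys, centers))  # close to [1.0, 2.0]
```

Because the target here lies exactly in the span of the two kernels, least squares recovers the generating weights; with real data the fit is the best approximation in that span.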
Recurrent Least Squares Support Vector Machines
 IEEE Transactions on Circuits and Systems I
, 2000
Abstract

Cited by 30 (8 self)
The method of support vector machines has been developed for solving classification and static function approximation problems. In this paper we introduce support vector machines within the context of recurrent neural networks. Instead of Vapnik's epsilon-insensitive loss function, we consider a least squares version related to a cost function with equality constraints for a recurrent network. Essential features of support vector machines remain, such as Mercer's condition and the fact that the output weights are a Lagrange-multiplier-weighted sum of the data points. The solution to recurrent least squares support vector machines is characterized by a set of nonlinear equations. Due to its high computational complexity, we focus on a limit case of assigning the squared error an infinitely large penalty factor, with early stopping as a form of regularization. The effectiveness of the approach is demonstrated on trajectory learning of the double scroll attractor in Chua's circuit. Keywords: Recurrent neural networks, Support vector machines, Radial basis functions, Double scroll.
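For the static least squares SVM that this paper extends to recurrent networks, the equality constraints reduce training to a single linear system in the bias and the Lagrange multipliers. A hedged sketch follows; the Gaussian kernel, the gamma value, and the tiny dense solver are illustrative assumptions.

```python
import math

# Static LS-SVM sketch: with equality constraints, training solves
# [[0, 1^T], [1, K + I/gamma]] [b; alpha] = [0; y] once, instead of
# a quadratic program. Kernel and gamma are illustrative choices.

def solve(A, y):
    # Plain Gaussian elimination with partial pivoting.
    n = len(A)
    M = [row[:] + [y[i]] for i, row in enumerate(A)]
    for k in range(n):
        piv = max(range(k, n), key=lambda r: abs(M[r][k]))
        M[k], M[piv] = M[piv], M[k]
        for r in range(k + 1, n):
            f = M[r][k] / M[k][k]
            for c in range(k, n + 1):
                M[r][c] -= f * M[k][c]
    x = [0.0] * n
    for k in range(n - 1, -1, -1):
        x[k] = (M[k][n] - sum(M[k][c] * x[c] for c in range(k + 1, n))) / M[k][k]
    return x

def lssvm_train(xs, ys, gamma=100.0):
    n = len(xs)
    K = [[math.exp(-(xs[i] - xs[j]) ** 2) for j in range(n)] for i in range(n)]
    A = [[0.0] + [1.0] * n]
    for i in range(n):
        A.append([1.0] + [K[i][j] + (1.0 / gamma if i == j else 0.0)
                          for j in range(n)])
    sol = solve(A, [0.0] + list(ys))
    return sol[0], sol[1:]   # bias b, multipliers alpha

def lssvm_predict(x, xs, b, alpha):
    # Output is a Lagrange-multiplier-weighted sum over the data points.
    return b + sum(a * math.exp(-(x - xi) ** 2) for a, xi in zip(alpha, xs))

xs = [-1.0, 0.0, 1.0]
ys = [-1.0, 1.0, -1.0]
b, alpha = lssvm_train(xs, ys)
print(lssvm_predict(0.0, xs, b, alpha))  # close to the target 1.0
```

Taking gamma to infinity, as in the paper's limit case, removes the I/gamma ridge and makes the system interpolate the targets exactly.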