
## Particle swarm optimization aided orthogonal forward regression for unified data modelling (2010)

Venue: | IEEE Transactions on Evolutionary Computation |

Citations: | 8 (4 self) |

### Citations

12873 | Statistical Learning Theory - Vapnik - 1998 |

6460 | Neural networks and pattern recognition - Bishop - 1995 |

4768 | Pattern Classification and Scene Analysis - Duda, Hart - 1973 |

Citation Context: ...MODELING from data is of fundamental importance in all walks of engineering. Various data modeling applications can be classified into three categories, namely, regression [1]–[3], classification [4]–[6], and probability density function (PDF) estimation [7]–[9]. In regression, the task is to establish a model that links the observation data to their target function or desired output values. ... |

3690 | Density estimation for statistics and data analysis - Silverman - 1986 |

3520 | Particle swarm optimization - Kennedy, Eberhart - 1995 |

Citation Context: ...proposes a particle swarm optimization (PSO) aided OFR algorithm to construct tunable-node RBF models for unified data modeling that includes regression, classification, and density estimation. PSO [70], [71] constitutes a population-based stochastic optimization technique, which was inspired by the social behavior of bird flocks or fish schools. The algorithm commences with random initialization of... |
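The snippet above describes the basic PSO loop: random initialization, then velocity and position updates driven by each particle's own best and the swarm's global best. A minimal gbest-PSO sketch is given below; the parameter values (`w`, `c1`, `c2`, swarm size) are generic textbook choices, not the paper's settings.

```python
import random

def pso(cost, dim, n_particles=20, iters=100, w=0.7, c1=1.5, c2=1.5, bound=5.0):
    # Random initialization of positions and velocities within the search space.
    pos = [[random.uniform(-bound, bound) for _ in range(dim)] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]                      # each particle's best-so-far position
    pbest_cost = [cost(p) for p in pos]
    gbest = pbest[min(range(n_particles), key=lambda i: pbest_cost[i])][:]
    for _ in range(iters):
        for i in range(n_particles):
            for j in range(dim):
                # Velocity update: inertia + cognitive (pbest) + social (gbest) terms.
                vel[i][j] = (w * vel[i][j]
                             + c1 * random.random() * (pbest[i][j] - pos[i][j])
                             + c2 * random.random() * (gbest[j] - pos[i][j]))
                pos[i][j] += vel[i][j]
            c = cost(pos[i])
            if c < pbest_cost[i]:
                pbest_cost[i], pbest[i] = c, pos[i][:]
                if c < cost(gbest):
                    gbest = pos[i][:]
    return gbest

# Example: minimize the 2-D sphere function, whose optimum is at the origin.
random.seed(0)
best = pso(lambda x: sum(v * v for v in x), dim=2)
```

The same loop is what the paper embeds inside each OFR stage to tune one RBF node at a time, with `cost` replaced by the LOO statistic.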

3326 | Time Series Analysis: Forecasting and Control - Box, Jenkins - 1976 |

Citation Context: ...regression. I. Introduction MODELING from data is of fundamental importance in all walks of engineering. Various data modeling applications can be classified into three categories, namely, regression [1]–[3], classification [4]–[6], and probability density function (PDF) estimation [7]–[9]. In regression, the task is to establish a model that links the observation data to their target function or des... |

3296 | A tutorial on support vector machines for pattern recognition - Burges - 1998 |

Citation Context: ...resulting RBF model is then further optimized using nonlinear optimization [56]. The SVM and related sparse kernel methods are equally applicable to regression, classification, and density estimation [57]–[62]. The OLS approach has also been extended to all three types of data modeling. In particular, the regularization assisted OLS (ROLS) algorithm based on minimizing the leave-one-out (LOO) MSE [48]... |

2681 | Atomic decomposition by basis pursuit - Chen, Donoho, et al. - 1998 |

1436 | Pattern recognition and neural networks - Ripley - 1996 |

Citation Context: ...MODELING from data is of fundamental importance in all walks of engineering. Various data modeling applications can be classified into three categories, namely, regression [1]–[3], classification [4]–[6], and probability density function (PDF) estimation [7]–[9]. In regression, the task is to establish a model that links the observation data to their target function or desired output values. The good... |

1180 | On estimation of a probability density function and mode - Parzen - 1962 |

Citation Context: ...walks of engineering. Various data modeling applications can be classified into three categories, namely, regression [1]–[3], classification [4]–[6], and probability density function (PDF) estimation [7]–[9]. In regression, the task is to establish a model that links the observation data to their target function or desired output values. The goodness of a... |

947 | Sparse Bayesian learning and the relevance vector machine - Tipping - 2001 |

764 | Learning the kernel matrix with semi-definite programming - Lanckriet, Cristianini, et al. |

Citation Context: ...identified using the orthogonal least squares (OLS) algorithm [44]–[48]. A similar linear learning approach is adopted in the support vector machine (SVM) and other sparse kernel modeling methods [49]–[55], which fix the kernel centers to the training input data points and adopt a common kernel variance for every kernel. A sparse kernel model is then sought by driving as many kernel weights as possible to near zero... |

686 | Learning with Kernels: Support Vector - Schölkopf, Smola - 2001 |

600 | Fast learning in networks of locally-tuned processing units - Moody, Darken - 1989 |

Citation Context: ...determined via other means, typically based on cross validation. Alternatively, clustering algorithms can be applied to find the RBF center vectors, as well as the associated basis function variances [39]–[42]. This leaves the RBF weights to be determined by the usual linear least squares solution. Again, the number of clusters has to be determined via cross validation. An alternative RBF network sele... |
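The two-stage scheme the snippet describes (cluster for centers, then linear least squares for weights) can be sketched as follows. This is a generic illustration on a toy 1-D regression problem; the common width `width = 1.0` is an assumed value standing in for what would normally come from cross validation.

```python
import numpy as np

rng = np.random.default_rng(0)

def kmeans(X, k, iters=20):
    # Lloyd's algorithm: the resulting cluster centers serve as RBF centers.
    centers = X[rng.choice(len(X), k, replace=False)]
    for _ in range(iters):
        labels = np.argmin(((X[:, None] - centers[None]) ** 2).sum(-1), axis=1)
        for j in range(k):
            if np.any(labels == j):
                centers[j] = X[labels == j].mean(axis=0)
    return centers

# Toy problem: fit y = sin(x) with a fixed-width Gaussian RBF network.
X = np.linspace(0, 2 * np.pi, 100)[:, None]
y = np.sin(X[:, 0])
centers = kmeans(X, k=10)
width = 1.0                                   # assumed common variance
Phi = np.exp(-((X - centers.T) ** 2) / (2 * width ** 2))   # design matrix of node responses
w, *_ = np.linalg.lstsq(Phi, y, rcond=None)   # RBF weights by linear least squares
```

Note the division of labor: only the weight estimation is a linear problem; the number of clusters and the width still require cross validation, which is exactly the overhead the cited paper's tunable-node approach avoids.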

550 | Classical and modern regression with applications. PWS-Kent Publishing Company - Myers - 1990 |

Citation Context: ...identify the n-unit RBF network instead, the "test" error of the resulting model can be calculated on the data point removed from training. This LOO modeling error, denoted as $e_k^{[n,-k]}$, is given by [2] $e_k^{[n,-k]} = e_k^{[n]} / \eta_k^{[n]}$ (27), where $\eta_k^{[n]}$ is the LOO error weighting [2]. The LOO MSE for the n-unit RBF network is then defined by $J_n = \frac{1}{N}\sum_{k=1}^{N} \bigl(e_k^{[n,-k]}\bigr)^2$ (28). This LOO MSE is a measu... |
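The point of identity (27) is that the leave-one-out residual comes from the ordinary residual and a weighting, with no refitting. For a plain linear-in-the-parameters least-squares model the weighting is the standard PRESS factor $1 - h_{kk}$ (the hat-matrix diagonal); the paper's $\eta_k^{[n]}$ is the orthogonal-decomposition form of the same quantity. A sketch of the identity, checked against brute-force refitting:

```python
import numpy as np

rng = np.random.default_rng(1)
X = rng.normal(size=(50, 3))
y = X @ np.array([1.0, -2.0, 0.5]) + 0.1 * rng.normal(size=50)

# Least-squares fit and hat matrix H = X (X'X)^{-1} X'.
w, *_ = np.linalg.lstsq(X, y, rcond=None)
H = X @ np.linalg.solve(X.T @ X, X.T)
resid = y - X @ w

# PRESS identity: the LOO residual is e_k / (1 - h_kk), so the LOO MSE (28) is
loo_mse = np.mean((resid / (1.0 - np.diag(H))) ** 2)

# Brute-force check: refit with each point deleted in turn.
brute = []
for k in range(len(y)):
    mask = np.arange(len(y)) != k
    wk, *_ = np.linalg.lstsq(X[mask], y[mask], rcond=None)
    brute.append((y[k] - X[k] @ wk) ** 2)
```

This closed form is why the LOO MSE $J_n$ is cheap enough to serve as the node-selection criterion inside an incremental construction loop.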

453 | Applied smoothing techniques for data analysis: the kernel approach with S-Plus illustrations - Bowman, Azzalini - 1997 |

Citation Context: ...s of engineering. Various data modeling applications can be classified into three categories, namely, regression [1]–[3], classification [4]–[6], and probability density function (PDF) estimation [7]–[9]. In regression, the task is to establish a model that links the observation data to their target function or desired output values. The goodness of a... |

422 | Orthogonal least squares learning algorithm for radial basis function networks - Chen, Cowan, et al. - 1991 |

339 | Support vector machines for classification and regression - Gunn - 1998 |

Citation Context: ...lidation. For kernel modeling methods, the learning algorithm's hyperparameters also have to be determined by cross validation. For example, for the SVM algorithm with the ε-insensitive cost function [50], the kernel variance as well as the regularization and error-band parameters must be specified. To avoid using costly cross validation for determining the RBF variance as is required by the above-ment... |

258 | Orthogonal least squares methods and their application to non-linear system identification - Chen, Billings, et al. - 1989 |

Citation Context: ...IEEE TRANSACTIONS ON EVOLUTIONARY COMPUTATION, VOL. 14, NO. 4, AUGUST 2010 ... for every RBF node. A parsimonious RBF network is then identified using the orthogonal least squares (OLS) algorithm [44]–[48]. A similar linear learning approach is adopted in the support vector machine (SVM) and other sparse kernel modeling methods [49]–[55], which fix the kernel centers to the training input data poi... |

186 | Self-organizing hierarchical particle swarm optimizer with time-varying acceleration coefficients - Ratnaweera, Halgamuge - 2004 |

Citation Context: ...Termination Condition Check: If the maximum number of iterations, Imax, is reached, terminate the algorithm with the solution gb(Imax); otherwise, set l = l + 1 and go to Step b). Ratnaweera and co-authors [73] reported that using a time-varying acceleration coefficient (TVAC) enhances the performance of PSO. We adopt this mechanism, in which c1 is reduced from 2.5 to 0.5 and c2 varies from 0.5 to 2.5 durin... |
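The TVAC schedule quoted in the snippet (c1: 2.5 → 0.5, c2: 0.5 → 2.5 over the run) is a linear interpolation in the iteration index. A small sketch, with the endpoint values taken from the snippet:

```python
def tvac(l, i_max, c1_init=2.5, c1_final=0.5, c2_init=0.5, c2_final=2.5):
    """Time-varying acceleration coefficients: c1 decreases and c2
    increases linearly as iteration l runs from 0 to i_max."""
    frac = l / i_max
    c1 = c1_init + (c1_final - c1_init) * frac
    c2 = c2_init + (c2_final - c2_init) * frac
    return c1, c2
```

Early iterations thus weight the cognitive (personal-best) term, encouraging exploration, while late iterations weight the social (global-best) term, encouraging convergence.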

134 | A clustering technique for digital communications channel equalization using radial basis function networks - Chen, Mulgrew, et al. - 1993 |

84 | Kernel matching pursuit - Vincent, Bengio |

81 | Multiplicative updates for nonnegative quadratic programming - Sha, Lin, et al. |

60 | Data clustering using particle swarm optimization - Merwe, Engelbrecht - 2003 |

Citation Context: ...in implementation, ability to rapidly converge to a "reasonably good" solution and to "steer clear" of local minima. It has been successfully applied to wide-ranging optimization problems [37], [38], [72]–[117]. Because of the simplicity and efficiency of the PSO method, the proposed PSO aided OFR algorithm based on LOO statistics for constructing tunable-node RBF models not only produces smaller RBF ... |

55 | Genetic evolution of radial basis function coverage using orthogonal niches - Whitehead - 1996 |

55 | Probability density estimation from optimally condensed data samples - Girolami, He - 2003 |

Citation Context: ...ting RBF model is then further optimized using nonlinear optimization [56]. The SVM and related sparse kernel methods are equally applicable to regression, classification, and density estimation [57]–[62]. The OLS approach has also been extended to all three types of data modeling. In particular, the regularization assisted OLS (ROLS) algorithm based on minimizing the leave-one-out (LOO) MSE [48] offe... |

55 | Adaptive particle swarm optimization - Zhan, Zhang, et al. - 2009 |

50 | Support vector method for multivariate density estimation - Vapnik, Mukherjee - 2000 |

Citation Context: ...and to adopt the EDF calculated using the training data as the desired response for the unknown CDF of the PDF p(x) to be estimated, as in the various fixed-kernel density estimation methods of [59]–[61], [130], [131]. The true CDF of the PDF p(x) is defined as $F(x) = \int_{-\infty}^{x} p(u)\,du$ (74), and the CDF associated with kernel... |
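The density-estimation formulation in this snippet treats the empirical distribution function (EDF) of the training samples as the desired response that the model's CDF should match. A minimal 1-D sketch of both sides of that match, using Gaussian kernels (the uniform weights here are illustrative; a sparse estimator would drive most of them to zero):

```python
import math

def edf(samples, x):
    # Empirical distribution function: fraction of samples <= x.
    return sum(1 for s in samples if s <= x) / len(samples)

def kernel_cdf(samples, weights, sigma, x):
    # CDF of a Gaussian mixture centered on the samples, i.e. the
    # model-side F(x) = integral of p(u) du from -inf to x, per (74).
    return sum(w * 0.5 * (1 + math.erf((x - s) / (sigma * math.sqrt(2))))
               for w, s in zip(weights, samples))

samples = [-1.0, 0.0, 1.0, 2.0]
weights = [0.25] * 4          # illustrative uniform weights
fit_here = kernel_cdf(samples, weights, 0.5, 0.5)
```

Fitting `kernel_cdf` to `edf` over the training points turns density estimation into the same regression-style problem as the other two modeling tasks, which is what makes the unified treatment possible.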

48 | Sparse modeling using orthogonal forward regression with PRESS statistic and regularization - Chen, Hong, et al. - 2004 |

Citation Context: ...for every RBF node. A parsimonious RBF network is then identified using the orthogonal least squares (OLS) algorithm [44]–[48]. A similar linear learning approach is adopted in the support vector machine (SVM) and other sparse kernel modeling methods [49]–[55], which fix the kernel centers to the training input data points a... |

46 | Sparse kernel regression modeling using combined locally regularized orthogonal least squares and D-optimality experimental design - Chen, Hong, et al. - 2003 |

45 | Support vector density estimation - Weston, Gammerman, et al. |

Citation Context: ...CDFs and to adopt the EDF calculated using the training data as the desired response for the unknown CDF of the PDF p(x) to be estimated, as in the various fixed-kernel density estimation methods of [59]–[61], [130], [131]. The true CDF of the PDF p(x) is defined as $F(x) = \int_{-\infty}^{x} p(u)\,du$ (74), and the CDF associated with k... |

42 | Recursive hybrid algorithm for non-linear system identification using radial basis function networks. Int - Chen, Billings, et al. - 1992 |

41 | Radial Basis Function Networks for Classifying Process Faults - Leonard, Kramer - 1991 |

40 | Combined genetic algorithm optimization and regularized orthogonal least squares learning for radial basis function networks - Chen, Wu, et al. - 1999 |

36 | A survey of particle swarm optimization applications in electric power operations - AlRashidi, El-Hawary - 2006 |

32 | Three learning phases for radial-basis-function networks - Schwenker, Kestler, et al. - 2001 |

Citation Context: ...networks has also been proposed. For example, an initial RBF network can be constructed using a linear learning method and the resulting RBF model is then further optimized using nonlinear optimization [56]. The SVM and related sparse kernel methods are equally applicable to regression, classification, and density estimation [57]–[62]. The OLS approach has also been extended to all three types of data m... |

29 | Parallel recursive prediction error algorithm for training layered neural networks - Chen, Cowan, et al. - 1990 |

Citation Context: ...or covariance matrices of its hidden nodes, as well as the weights that connect the RBF nodes to the network output, can be trained together via nonlinear optimization using gradient based algorithms [27]–[31], the expectation-maximization (EM) algorithm [32], [33], or various evolutionary algorithms [34]–[38]. Generally speaking, learning based on such a nonlinear approach is computationally expensiv... |

27 | Multiobjective evolutionary optimization of the size, shape, and position parameters of radial basis function networks for function approximation - González, Rojas, et al. - 2003 |

27 | Particle Swarm Optimization Methods for Pattern Recognition and - Omran - 2004 |

25 | Evolving Space-Filling Curves to Distribute Radial Basis Functions Over an Input Space - Whitehead, Choate - 1994 |

Citation Context: ...network output, can be trained together via nonlinear optimization using gradient based algorithms [27]–[31], the expectation-maximization (EM) algorithm [32], [33], or various evolutionary algorithms [34]–[38]. Generally speaking, learning based on such a nonlinear approach is computationally expensive and may encounter the problem of local minima. Additionally, the network structure or the number of ... |

24 | An improved radial basis function network for visual autonomous road following - Rosenblum, Davis - 1996 |

24 |
Particle swarm optimisers for cluster formation in wireless sensor networks
- Guru, Halgamuge, et al.
- 2005
(Show Context)
Citation Context ...inertia weight, rand() denotes the uniform random number between 0 and 1, and c1 and c2 are the two acceleration coefficients. In order to avoid excessive roaming of particles beyond the search space =-=[75]-=-, a velocity space m ′∏ j=1 Vj = m ′∏ j=1 [−Vj,max, Vj,max] (22) is imposed on v(l+1)i so that If (v(l+1)i |j > Vj,max) v(l+1)i |j = Vj,max; If (v(l+1)i |j < −Vj,max) v(l+1)i |j = −Vj,max; where v|j d... |
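The velocity-space constraint (22) in this snippet is component-wise clamping of each particle's velocity to $[-V_{j,\max}, V_{j,\max}]$. A direct sketch:

```python
def clamp_velocity(v, v_max):
    """Component-wise velocity clamping per (22): each |v_j| is limited
    to V_{j,max} to stop particles roaming out of the search space."""
    return [max(-vm, min(vm, vj)) for vj, vm in zip(v, v_max)]

clamped = clamp_velocity([3.0, -7.5, 0.2], v_max=[2.0, 5.0, 1.0])
```

Per-dimension limits (rather than one global limit) matter here because the PSO-OFR search space mixes center coordinates and covariance entries with very different scales.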

23 | The determination of multivariable nonlinear models for dynamical systems - Billings, Chen - 1998 |

23 | Data classification with radial basis function networks based on a novel kernel density estimation algorithm - Oyang, Hwang, et al. - 2005 |

23 | Estimation of elliptical basis function parameters by the EM algorithm with application to speaker verification - Mak, Kung - 2000 |

Citation Context: ...weights that connect the RBF nodes to the network output, can be trained together via nonlinear optimization using gradient based algorithms [27]–[31], the expectation-maximization (EM) algorithm [32], [33], or various evolutionary algorithms [34]–[38]. Generally speaking, learning based on such a nonlinear approach is computationally expensive and may encounter the problem of local minima. Additi... |

23 | Experiments with repeating weighted boosting search for optimization in signal processing applications - Chen, Wang, et al. - 2005 |

Citation Context: ...covariance matrix of a RBF node by maximizing the correlation criterion is a nonlinear and nonconvex optimization task, a global search algorithm known as the repeated weighted boosting search (RWBS) [67] is employed to perform this optimization. This RBF regression does not need to learn any hyperparameter. However, it is required to fit a diagonal RBF covariance matrix to every training data point, ... |

21 | A Stochastic Model for the Optimal Operation of a Wind-Thermal Power System - Pappala, Erlich, et al. |

20 | Nonlinear Systems Identification Using Radial Basis Functions - Chen, Billings, et al. - 1990 |

Citation Context: ...ression framework can be adopted for all three classes of data modeling problems. The radial basis function (RBF) network has found wide-ranging data modeling applications in diverse engineering fields [10]–[26]. The parameters of the RBF network, which include the center vectors and variances or covariance matrices of its hidden nodes, as well as the weights that connect the RBF nodes to the network ou... |
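The parameterization this snippet lists (per-node center vector, variance or covariance matrix, and an output weight per node) can be made concrete with a small sketch of a Gaussian RBF network whose nodes carry diagonal covariance matrices, the node form used throughout the cited tunable-node work. The specific numbers below are illustrative only.

```python
import math

def rbf_node(x, center, diag_cov):
    """Response of one Gaussian RBF node with a tunable center vector
    and a diagonal covariance matrix (one variance per input dimension)."""
    q = sum((xj - cj) ** 2 / sj for xj, cj, sj in zip(x, center, diag_cov))
    return math.exp(-0.5 * q)

def rbf_network(x, centers, covs, weights):
    # Network output: weighted sum of the node responses.
    return sum(w * rbf_node(x, c, s) for w, c, s in zip(weights, centers, covs))

y = rbf_network([0.0, 0.0],
                centers=[[0.0, 0.0], [1.0, 1.0]],
                covs=[[1.0, 1.0], [0.5, 0.5]],
                weights=[2.0, 1.0])
```

Training "together via nonlinear optimization" means jointly searching over `centers`, `covs`, and `weights`; the OFR alternative keeps the weights linear and searches only each node's center and covariance in turn.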

20 | A hybrid linear/nonlinear training algorithm for feedforward neural networks - McLoone, Brown, et al. |

18 | Nonlinear time series modeling and prediction using Gaussian RBF networks with enhanced clustering and - Chen - 1995 |

18 | A hybrid particle swarm optimization–back-propagation algorithm for feedforward neural network training, Applied Mathematics and Computation - Zhang, Zhang, et al. |

17 | Particle-swarm-optimization-based multiuser detector for CDMA communications - Soo, Siu, et al. - 2007 |

16 | RBFN restoration of nonlinearly degraded images - Cha, Kassam - 1996 |

16 | Particle swarm optimization training algorithm for ANNs in stage prediction of Shing Mun River - Chau - 2006 |

16 | PSO-based multiobjective optimization with dynamic population size and adaptive local archives - Leong, Yen - 2008 |

14 | Training reformulated radial basis function neural networks capable of identifying uncertainty in data classification - Karayiannis, Xiong - 2006 |

13 | Automatic detection of epileptiform events in EEG by a three-stage procedure based on artificial neural networks - Acir, Oztura, et al. |

13 | Robust maximum likelihood training of heteroscedastic probabilistic neural networks - Yang, Chen - 1998 |

Citation Context: ...the weights that connect the RBF nodes to the network output, can be trained together via nonlinear optimization using gradient based algorithms [27]–[31], the expectation-maximization (EM) algorithm [32], [33], or various evolutionary algorithms [34]–[38]. Generally speaking, learning based on such a nonlinear approach is computationally expensive and may encounter the problem of local minima. Additi... |

13 | An orthogonal forward regression technique for sparse kernel density estimation. Neurocomputing, 71(4–6) - Chen, Hong, et al. - 2008 |

Citation Context: ...n rate can be computed efficiently, just as in the case of the LOO MSE for regression, and this ensures a fast RBF classifier construction. A sparse density estimation technique has been developed in [64], which uses the ROLS algorithm based on the LOO MSE to select a parsimonious density estimate and computes the associated kernel weights using the multiplicative nonnegative quadratic programming (MN... |
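The MNQP step mentioned here solves a quadratic program with nonnegativity constraints on the kernel weights by multiplicative fixed-point updates, in the style of the "Multiplicative updates for nonnegative quadratic programming" reference listed above (minimize $\tfrac{1}{2}v^{\top}Av + b^{\top}v$ subject to $v \ge 0$). The sketch below follows that generic formulation; the paper's density-estimation variant may add further constraints (e.g. weights summing to one).

```python
import math

def mnqp(A, b, v, iters=200):
    """Multiplicative nonnegative QP updates (Sha et al. style):
    minimize 0.5 v'Av + b'v subject to v >= 0, starting from positive v.
    Assumes (A+ v)_i stays positive; sequential (in-place) sweeps."""
    n = len(b)
    Ap = [[max(a, 0.0) for a in row] for row in A]   # positive part of A
    An = [[max(-a, 0.0) for a in row] for row in A]  # negative part of A
    for _ in range(iters):
        for i in range(n):
            ap = sum(Ap[i][j] * v[j] for j in range(n))
            an = sum(An[i][j] * v[j] for j in range(n))
            v[i] *= (-b[i] + math.sqrt(b[i] ** 2 + 4 * ap * an)) / (2 * ap)
    return v

# Toy problem: A = 2I, b = (-2, -2); the unconstrained minimizer v = (1, 1)
# is already nonnegative, so the updates should converge to it.
v = mnqp([[2.0, 0.0], [0.0, 2.0]], [-2.0, -2.0], [0.5, 0.5])
```

The appeal for kernel density estimation is that the updates keep every weight nonnegative by construction, so the resulting mixture is a valid density without any projection step.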

13 | Scheduling of demand side resources using binary particle swarm optimization - Pedrasa, Spooner, et al. - 2009 |

10 | Radial basis function network architecture for nonholonomic motion planning and control of free-flying manipulators - Gorinevsky, Kapitanovsky, et al. |

10 | Structured parameter optimization method for the radial basis function-based state-dependent autoregressive model - Peng, Ozaki, et al. |

Citation Context: ...variance matrices of its hidden nodes, as well as the weights that connect the RBF nodes to the network output, can be trained together via nonlinear optimization using gradient based algorithms [27]–[31], the expectation-maximization (EM) algorithm [32], [33], or various evolutionary algorithms [34]–[38]. Generally speaking, learning based on such a nonlinear approach is computationally expensive and... |

9 | An Adaptive Neurocontroller Using RBFN for Robot Manipulators - Lee, Choi - 2004 |

9 | Comparative aspects of neural network algorithms for on-line modeling of dynamic processes - An, Brown, et al. - 1993 |

9 | Self-generation RBFNs using evolutional PSO learning - Feng |

Citation Context: ...simplicity in implementation, ability to rapidly converge to a "reasonably good" solution and to "steer clear" of local minima. It has been successfully applied to wide-ranging optimization problems [37], [38], [72]–[117]. Because of the simplicity and efficiency of the PSO method, the proposed PSO aided OFR algorithm based on LOO statistics for constructing tunable-node RBF models not only produces ... |

8 | A tutorial on support vector regression,” Royal Holloway - Smola, Schölkopf - 1998 |

8 | Workpiece dynamic analysis and prediction during chatter of turning process - Adam, Firpi, et al. - 2008 |

8 | A particle swarm optimization-based multiuser detection for receive-diversity-aided STBC systems - Liu, Li |

8 | Quality Disturbance Classification Using Fuzzy C-Means Algorithm and Adaptive Particle Swarm Optimization - Biswal, Dash, et al. |

8 | Harmonic Minimization in Multilevel Inverters Using Modified Species-Based Particle Swarm Optimization - Hagh, Taghizadeh, et al. - 2009 |

7 | Mapping ocean sediments by RBF networks - Caiti, Parisini - 1994 |

7 | An attempt for coloring multichannel MR imaging data - Muraki, Nakai, et al. - 2001 |

7 | Multi-step ahead nonlinear identification of Lorenz's chaotic system using radial basis neural network with learning by clustering and particle swarm optimization - Guerra, Coelho |

Citation Context: ...k output, can be trained together via nonlinear optimization using gradient based algorithms [27]–[31], the expectation-maximization (EM) algorithm [32], [33], or various evolutionary algorithms [34]–[38]. Generally speaking, learning based on such a nonlinear approach is computationally expensive and may encounter the problem of local minima. Additionally, the network structure or the number of RBF n... |

6 | A neural-network approach for semiconductor wafer post-sawing inspection - Su, Yang, et al. - 2002 |

6 | Robust neuro-H controller design for aircraft auto-landing - Li, Sundararajan, et al. - 2004 |

6 | Coded modulation assisted radial basis function aided turbo equalization for dispersive rayleigh-fading channels - Ng, Yee, et al. - 2004 |

6 | Adequate determination of a band of wavelet threshold for noise cancellation using particle swarm optimization - Sun, Liu, et al. |

6 | A hybrid of cooperative particle swarm optimization and cultural algorithm for neural fuzzy networks and its prediction applications - Lin, Chen, et al. - 1982 |

6 | Determination of capacity benefit margin in multiarea power systems using particle swarm optimization - Ramezani, Haghifam, Moghaddam - 2009 |

5 | Sensitivity analysis applied to the construction of radial basis function networks - Shi, Yeung, et al. - 2005 |

Citation Context: ...ed by the usual linear least squares solution. Again, the number of clusters has to be determined via cross validation. An alternative RBF network selection criterion is based on sensitivity analysis [43]. However, one of the most popular approaches for constructing RBF models for regression is to formulate the problem as a linear learning problem by considering the training input data points as candi... |

5 | A fast linear-in-the-parameters classifier construction algorithm using orthogonal forward selection to minimize leave-one-out misclassification rate - Hong, Chen, et al. |

Citation Context: ...DF estimate construction, CPSO-OFR does not include the complexity of the MNQP algorithm for updating the model weights, but this complexity is small and can be neglected. The ROLS-LOO algorithm [48], [63], [64] is an efficient construction algorithm for selecting fixed-node RBF models. Fig. 1. Engine data set: (a) input υk, ... |

5 | The application of particle swarm optimization to passive and hybrid active power filter design - He, Xu, et al. - 2009 |

5 | Integrated hybrid-PSO and fuzzy-NN decoupling control for temperature of reheating furnace - Liao, She, et al. - 2009 |

4 | Radial basis function networks for contingency analysis of bulk power systems - Refaee, Mohandes, et al. - 1999 |

4 | Iterative reference adjustment for high-precision and repetitive motion control applications - Tan, Zhao, et al. - 2005 |

Citation Context: ...n framework can be adopted for all three classes of data modeling problems. The radial basis function (RBF) network has found wide-ranging data modeling applications in diverse engineering fields [10]–[26]. The parameters of the RBF network, which include the center vectors and variances or covariance matrices of its hidden nodes, as well as the weights that connect the RBF nodes to the network output,... |

4 | Construction of RBF classifiers with tunable units using orthogonal forward selection based on leave-one-out misclassification rate - Chen, Hong, et al. - 2006 |

Citation Context: ...mputationally costly, particularly for a large training data set. More effective construction algorithms for the RBF network with tunable nodes for regression and classification are proposed in [68], [69], where each RBF unit has a tunable center vector as well as an adjustable diagonal covariance matrix. An orthogonal forward regression (OFR) procedure is employed to optimize the RBF units one by one... |

4 | A functional-link-based fuzzy neural network for temperature control - Chen, Lin, et al. |

4 | Dynamic multiple swarms in multiobjective particle swarm optimization - Yen, Leong - 2009 |

4 | Recurrent functional-link-based fuzzy-neuralnetwork-controlled induction generator system using improved particle swarm optimization - Lin, Teng, et al. - 2009 |

3 | Clustering-based algorithms for single-hidden-layer sigmoid perceptron - Uykan - 2003 |

Citation Context: ...mined via other means, typically based on cross validation. Alternatively, clustering algorithms can be applied to find the RBF center vectors, as well as the associated basis function variances [39]–[42]. This leaves the RBF weights to be determined by the usual linear least squares solution. Again, the number of clusters has to be determined via cross validation. An alternative RBF network selection... |

3 | Orthogonal forward selection for constructing the radial basis function network with tunable nodes - Chen, Hong, et al. - 2005 |

Citation Context: ...be computationally costly, particularly for a large training data set. More effective construction algorithms for the RBF network with tunable nodes for regression and classification are proposed in [68], [69], where each RBF unit has a tunable center vector as well as an adjustable diagonal covariance matrix. An orthogonal forward regression (OFR) procedure is employed to optimize the RBF units one ... |

3 | Identification and control of nonlinear systems by a dissimilation particle swarm optimization-based Elman neural network. Nonlinear Analysis - Ge, Qian, et al. |

3 | Thinned Planar Array Design Using Boolean PSO With Velocity Mutation - Deligkaris, Zaharis, et al. - 2009 |

3 | An improved comprehensive learning particle swarm optimization and its application to the semiautomatic design of antennas - Wu, Geng, et al. - 2009 |

3 | Design of RF Window Using Multi-objective Particle Swarm Optimization - Chauhan, Kartikeyan, et al. |

2 | Identification of Nonlinear Systems Using Generalized Kernel Models - Chen, Hong, et al. - 2005 |

Citation Context: ...ified. To avoid using costly cross validation for determining the RBF variance as is required by the above-mentioned linear learning approach for fixed-node RBF or kernel modeling methods, Chen et al. [66] adopt a strategy of fitting a diagonal covariance matrix to each candidate RBF node, which as usual is centered at a training input point, by optimizing the correlation criterion between the training... |

2 | A PSO-based subtractive clustering technique for designing RBF neural networks - Chen, Qin, et al. - 2008 |

2 | Beamforming in the presence of mutual coupling based on constrained particle swarm optimization - Demarcke, Rogier, et al. - 2009 |

1 | Adaptive acquisition and tracking for deep space array feed antennas - Mukai, Vilnrotter, et al. - 2002 |

1 | Resolving superimposed MUAPs using particle swarm optimization - Marateb, McGill - 2009 |