Results 1–10 of 226
Regularization Theory and Neural Networks Architectures
Neural Computation, 1995
"... We had previously shown that regularization principles lead to approximation schemes which are equivalent to networks with one layer of hidden units, called Regularization Networks. In particular, standard smoothness functionals lead to a subclass of regularization networks, the well known Radial Ba ..."
Cited by 395 (32 self)
"... to different classes of basis functions. Additive splines as well as some tensor product splines can be obtained from appropriate classes of smoothness functionals. Furthermore, the same generalization that extends Radial Basis Functions (RBF) to Hyper Basis Functions (HBF) also leads from additive models ..."
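The regularization-network view summarized above reduces training to a single linear solve: with a Gaussian kernel matrix K over the training points and a regularization parameter λ, the hidden-to-output coefficients c satisfy (K + λI)c = y. A minimal numpy sketch of that idea (the kernel width, λ, and the toy data are illustrative assumptions, not values from the paper):

```python
import numpy as np

def gaussian_kernel(X, C, sigma):
    # Pairwise Gaussian kernel values between rows of X and centers C
    d2 = ((X[:, None, :] - C[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * sigma ** 2))

def fit_regularization_network(X, y, lam=1e-3, sigma=1.0):
    # One hidden unit per data point; coefficients solve (K + lam*I) c = y
    K = gaussian_kernel(X, X, sigma)
    return np.linalg.solve(K + lam * np.eye(len(X)), y)

def rn_predict(Xq, X, c, sigma=1.0):
    # Output is a weighted sum of radial basis functions centered at X
    return gaussian_kernel(Xq, X, sigma) @ c

# Toy example: smooth a noisy 1-D sine
rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(40, 1))
y = np.sin(X[:, 0]) + 0.05 * rng.normal(size=40)
c = fit_regularization_network(X, y)
resid = np.abs(rn_predict(X, X, c) - y).max()
```

With λ → 0 this degenerates to exact RBF interpolation; larger λ trades training fit for smoothness, which is exactly the role the smoothness functional plays in the regularization framework.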
Comparing Support Vector Machines with Gaussian Kernels to Radial Basis Function Classifiers
IEEE Transactions on Signal Processing, 1997
"... The Support Vector (SV) machine is a novel type of learning machine, based on statistical learning theory, which contains polynomial classifiers, neural networks, and radial basis function (RBF) networks as special cases. In the RBF case, the SV algorithm automatically determines centers, weights an ..."
Cited by 183 (13 self)
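The Gaussian kernel shared by the SV machine and classical RBF networks is k(x, x') = exp(−‖x − x'‖² / (2σ²)). A short numpy check of its basic Gram-matrix properties (σ and the random data here are arbitrary illustrative choices):

```python
import numpy as np

def rbf_gram(X, sigma=1.0):
    # Gram matrix K[i, j] = exp(-||x_i - x_j||^2 / (2 sigma^2))
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * sigma ** 2))

X = np.random.default_rng(1).normal(size=(5, 3))
K = rbf_gram(X)
sym = np.allclose(K, K.T)                       # symmetric
unit_diag = np.allclose(np.diag(K), 1.0)        # k(x, x) = 1
min_eig = np.linalg.eigvalsh(K).min()           # PSD up to rounding
```

Positive semidefiniteness is what lets the same kernel serve both as an SV-machine kernel and as a radial basis function, the correspondence the paper exploits when comparing the two classifiers.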
Extreme learning machine: RBF network case
in Proc. 8th Int. Conf. Control, Autom., Robot., Vis. (ICARCV 2004)
"... Abstract – A new learning algorithm called extreme learning machine (ELM) has recently been proposed for single-hidden-layer feedforward neural networks (SLFNs) to easily achieve good generalization performance at extremely fast learning speed. ELM randomly chooses the input weights and analytically ..."
Cited by 20 (9 self)
"... and analytically determines the output weights of SLFNs. This paper shows that ELM can be extended to the radial basis function (RBF) network case, which allows the centers and impact widths of RBF kernels to be randomly generated and the output weights to be simply analytically calculated instead of iteratively tuned ..."
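The ELM-RBF recipe described in the abstract can be sketched in a few lines: draw centers and widths at random, then obtain the output weights in one least-squares step via the pseudoinverse. The ranges for the widths, the center-sampling scheme, and the toy task below are illustrative assumptions, not the paper's settings:

```python
import numpy as np

def rbf_hidden(X, centers, widths):
    # Hidden-layer activations: Gaussian bumps at the given centers/widths
    d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * widths ** 2))

def elm_rbf_fit(X, y, n_hidden=40, seed=0):
    rng = np.random.default_rng(seed)
    centers = X[rng.choice(len(X), size=n_hidden, replace=False)]  # random centers
    widths = rng.uniform(0.5, 2.0, size=n_hidden)                  # random impact widths
    H = rbf_hidden(X, centers, widths)
    beta = np.linalg.pinv(H) @ y   # output weights: one analytic solve, no iteration
    return centers, widths, beta

def elm_rbf_predict(X, centers, widths, beta):
    return rbf_hidden(X, centers, widths) @ beta

# Toy regression: fit sin(x) without tuning any hidden-layer parameter
rng = np.random.default_rng(1)
X = rng.uniform(-3, 3, size=(80, 1))
y = np.sin(X[:, 0])
centers, widths, beta = elm_rbf_fit(X, y)
err = np.abs(elm_rbf_predict(X, centers, widths, beta) - y).mean()
```

The contrast with conventional RBF training is that nothing in the hidden layer is optimized: only `beta` is computed, which is what makes the method fast.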
Boosting the Performance of RBF Networks with Dynamic Decay Adjustment
 Advances in Neural Information Processing Systems
, 1995
"... Radial Basis Function (RBF) Networks, also known as networks of locally-tuned processing units (see [6]) are well known for their ease of use. Most algorithms used to train these types of networks, however, require a fixed architecture, in which the number of units in the hidden layer must be deter ..."
Cited by 42 (8 self)
On the Influence of the Kernel on the Consistency of Support Vector Machines
 Journal of Machine Learning Research
, 2001
"... In this article we study the generalization abilities of several classifiers of support vector machine (SVM) type using a certain class of kernels that we call universal. It is shown that the soft margin algorithms with universal kernels are consistent for a large class of classification problems ..."
Cited by 212 (21 self)
"... problems including some kind of noisy tasks provided that the regularization parameter is chosen well. In particular we derive a simple sufficient condition for this parameter in the case of Gaussian RBF kernels. On the one hand our considerations are based on an investigation of an approximation property ..."
Combination Methods for Ensembles of RBF Networks
"... Abstract: Building an ensemble of classifiers is a useful way to improve performance. In the case of neural networks the bibliography has centered on the use of Multilayer Feedforward (MF). However, there are other interesting networks like Radial Basis Functions (RBF) that can be used as elem ..."
RBF-based Image Restoration Utilising Auxiliary Points
"... Utilisation of Radial Basis Functions (RBF) for reconstruction of damaged images has become a common technique nowadays. This paper deals with computation and utilisation of auxiliary points in order to further increase the ability of RBF to restore damaged areas in an image. Our goal was to achieve the best ..."
Cited by 5 (1 self)
Automatic Basis Selection Techniques for RBF Networks
"... This paper proposes a generic criterion that defines the optimum number of basis functions for radial basis function neural networks. The generalization performance of an RBF network relates to its prediction capability on independent test data. This performance gives a measure of the quality of the ..."
Cited by 4 (0 self)
Active Learning the Weights of an RBF Network
, 1995
"... We describe a principled strategy to sample functions optimally for function approximation tasks. The strategy works within a Bayesian framework and uses ideas from optimal experiment design to evaluate the potential utility of new data points. We consider an application of this general framework fo ..."
Cited by 2 (1 self)
"... for active learning the weight coefficients of a Gaussian Radial Basis Function (RBF) network. We also derive some sufficiency conditions on the learning problem for which there are analytical solutions to the data sampling procedure. ..."
RBF Networks with Mixed Radial Basis Functions
"... After the introduction to neural network technology as multivariable function approximation, radial basis function (RBF) networks have been studied in many different aspects in recent years. From the theoretical viewpoint, approximation and uniqueness of the interpolation is studied and it has been ..."
"... filtering effect, the RBF networks are not favourable for high frequencies unless a relatively high number of hidden nodes is used. Therefore, for approximations that have only low frequency components, RBF networks provide satisfactory results and this is presumably the case in many favourable RBF ..."