Results 1 – 10 of 5,329
Regularization Theory and Neural Networks Architectures
Neural Computation, 1995
"... We had previously shown that regularization principles lead to approximation schemes which are equivalent to networks with one layer of hidden units, called Regularization Networks. In particular, standard smoothness functionals lead to a subclass of regularization networks, the well known Radial Ba ..."
Cited by 395 (32 self)
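As a rough sketch of the regularization-network form this abstract describes, the following Python fragment fits a one-hidden-layer approximator with Gaussian basis functions centred at the training points and a simple ridge penalty; the Gaussian choice, the width, and the regularization weight are illustrative assumptions, not values prescribed by the paper.

    import numpy as np

    def fit_rbf_network(x, y, width=0.5, reg=1e-3):
        # f(x) = sum_i c_i * exp(-(x - x_i)^2 / (2*width^2)): one hidden layer of RBF units.
        gram = np.exp(-(x[:, None] - x[None, :]) ** 2 / (2.0 * width ** 2))
        # Ridge-regularized least squares for the output-layer coefficients.
        return np.linalg.solve(gram + reg * np.eye(len(x)), y)

    def predict(x_train, coeffs, x_new, width=0.5):
        gram = np.exp(-(x_new[:, None] - x_train[None, :]) ** 2 / (2.0 * width ** 2))
        return gram @ coeffs

    x = np.linspace(0.0, 1.0, 20)
    y = np.sin(2.0 * np.pi * x)
    coeffs = fit_rbf_network(x, y)
    print(predict(x, coeffs, np.array([0.25, 0.75])))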
Reconstruction and Representation of 3D Objects with Radial Basis Functions
Computer Graphics (SIGGRAPH ’01 Conf. Proc.), pages 67–76, ACM SIGGRAPH, 2001
"... We use polyharmonic Radial Basis Functions (RBFs) to reconstruct smooth, manifold surfaces from point-cloud data and to repair incomplete meshes. An object's surface is defined implicitly as the zero set of an RBF fitted to the given surface data. Fast methods for fitting and evaluating RBFs al ..."
Cited by 505 (1 self)
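A minimal sketch of the implicit-surface idea in this abstract, assuming the biharmonic kernel phi(r) = r, off-surface constraints made by offsetting sample points along their normals, and no low-degree polynomial term (which a full polyharmonic fit would include); all names and parameters here are illustrative.

    import numpy as np

    def fit_implicit_rbf(points, normals, eps=0.01):
        # Constrain s(x) = 0 on the surface and s(x) = +/- eps at offset points.
        centers = np.vstack([points, points + eps * normals, points - eps * normals])
        values = np.concatenate([np.zeros(len(points)),
                                 np.full(len(points), eps),
                                 np.full(len(points), -eps)])
        # Biharmonic kernel phi(r) = r; a tiny ridge keeps the solve stable.
        r = np.linalg.norm(centers[:, None, :] - centers[None, :, :], axis=-1)
        weights = np.linalg.solve(r + 1e-9 * np.eye(len(centers)), values)
        return centers, weights

    def evaluate(x, centers, weights):
        # s(x) = sum_i w_i * ||x - c_i||; the reconstructed surface is {x : s(x) = 0}.
        r = np.linalg.norm(x[:, None, :] - centers[None, :, :], axis=-1)
        return r @ weights

Evaluating s on a grid and extracting its zero level set (for example with marching cubes) would then give a mesh approximating the sampled surface.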
Efficient Reconstruction of Large Scattered Geometric Datasets Using the Partition of Unity and Radial Basis Functions
2004
"... We present a new scheme for the reconstruction of large geometric data. It is based on the well-known radial basis function model combined with an adaptive spatial and functional subdivision associated with a family of functions forming a partition of unity. This combination offers robust and effi ..."
Cited by 21 (1 self)
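The core of the partition-of-unity combination mentioned above can be sketched in a few lines: fit cheap local approximants on overlapping subdomains, then blend them with non-negative weights normalized to sum to one. The weight function and the fixed subdivision below are simplistic placeholders, not the adaptive scheme the paper describes.

    import numpy as np

    def pou_blend(x_eval, local_fits, centers, radius):
        # Compactly supported bump around each subdomain centre.
        d = np.abs(x_eval[:, None] - centers[None, :]) / radius
        w = np.clip(1.0 - d, 0.0, 1.0) ** 2
        # Normalizing makes the weights a partition of unity wherever coverage exists.
        w /= np.maximum(w.sum(axis=1, keepdims=True), 1e-12)
        # Weighted combination of the local approximants.
        local_vals = np.stack([f(x_eval) for f in local_fits], axis=1)
        return (w * local_vals).sum(axis=1)

    # Two overlapping local quadratic fits on [0, 1], blended into one global approximant.
    centers = np.array([0.25, 0.75])
    xa, xb = np.linspace(0.0, 0.6, 10), np.linspace(0.4, 1.0, 10)
    fits = [np.poly1d(np.polyfit(xa, np.sin(xa), 2)),
            np.poly1d(np.polyfit(xb, np.sin(xb), 2))]
    print(pou_blend(np.array([0.3, 0.5, 0.7]), fits, centers, radius=0.5))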
A training algorithm for optimal margin classifiers
Proceedings of the 5th Annual ACM Workshop on Computational Learning Theory, 1992
"... A training algorithm that maximizes the margin between the training patterns and the decision boundary is presented. The technique is applicable to a wide variety of classification functions, including Perceptrons, polynomials, and Radial Basis Functions. The effective number of parameters is adjust ..."
Cited by 1865 (43 self)
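The maximum-margin training described in this abstract is what modern SVM libraries implement; as a hedged illustration (using scikit-learn's SVC rather than the paper's original 1992 algorithm, with toy data and parameter values of my own choosing):

    import numpy as np
    from sklearn.svm import SVC

    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 2))
    y = (X[:, 0] ** 2 + X[:, 1] ** 2 > 1.0).astype(int)   # not linearly separable

    # Maximum-margin classifier with a radial basis function kernel;
    # C trades margin width against training errors.
    clf = SVC(kernel="rbf", C=1.0, gamma="scale").fit(X, y)
    print(clf.n_support_, clf.score(X, y))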
Gradient projection for sparse reconstruction: Application to compressed sensing and other inverse problems
IEEE Journal of Selected Topics in Signal Processing, 2007
"... Many problems in signal processing and statistical inference involve finding sparse solutions to underdetermined, or ill-conditioned, linear systems of equations. A standard approach consists in minimizing an objective function which includes a quadratic (squared ℓ2) error term combined with a sparseness-inducing (ℓ1) regularization term. Basis pursuit, the least absolute shrinkage and selection operator (LASSO), wavelet-based deconvolution, and compressed sensing are a few well-known examples of this approach. This paper proposes gradient projection (GP) algorithms for the bound ..."
Cited by 539 (17 self)
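As a bare-bones sketch of the gradient-projection idea (using the standard split x = u - v with u, v >= 0, a fixed step size, and no line search or debiasing, so this conveys only the flavour of the approach rather than the algorithms developed in the paper):

    import numpy as np

    def gp_sparse(A, y, tau, steps=1000):
        # Minimize 0.5*||y - A x||^2 + tau*||x||_1 via projection onto u, v >= 0.
        n = A.shape[1]
        u = np.zeros(n)
        v = np.zeros(n)
        lr = 0.5 / np.linalg.norm(A, 2) ** 2      # conservative fixed step
        for _ in range(steps):
            g = A.T @ (A @ (u - v) - y)           # gradient of the quadratic term
            u = np.maximum(u - lr * (g + tau), 0.0)
            v = np.maximum(v - lr * (-g + tau), 0.0)
        return u - v

    # Toy compressed-sensing example: recover a 3-sparse vector from 40 measurements.
    rng = np.random.default_rng(1)
    A = rng.normal(size=(40, 100)) / np.sqrt(40)
    x_true = np.zeros(100)
    x_true[[3, 30, 77]] = [1.0, -2.0, 1.5]
    x_hat = gp_sparse(A, A @ x_true, tau=0.02)
    print(np.round(x_hat[[3, 30, 77]], 2))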
Priors, Stabilizers and Basis Functions: from regularization to radial, tensor and additive splines
1993
"... We had previously shown that regularization principles lead to approximation schemes which are equivalent to networks with one layer of hidden units, called Regularization Networks. In particular we had discussed how standard smoothness functionals lead to a subclass of regularization networks, the well-known Radial Basis Functions approximation schemes. In this paper we show that regularization networks encompass a much broader range of approximation schemes, including many of the popular general additive models and some of the neural networks. In particular we introduce new classes of smoothness ..."
Cited by 86 (14 self)
Training Support Vector Machines: an Application to Face Detection
1997
"... We investigate the application of Support Vector Machines (SVMs) in computer vision. SVM is a learning technique developed by V. Vapnik and his team (AT&T Bell Labs.) that can be seen as a new method for training polynomial, neural network, or Radial Basis Functions classifiers. The decision sur ..."
Cited by 727 (1 self)
Convolution Kernels on Discrete Structures
1999
"... We introduce a new method of constructing kernels on sets whose elements are discrete structures like strings, trees and graphs. The method can be applied iteratively to build a kernel on an infinite set from kernels involving generators of the set. The family of kernels generated generalizes the family of radial basis kernels. It can also be used to define kernels in the form of joint Gibbs probability distributions. Kernels can be built from hidden Markov random fields, generalized regular expressions, pair-HMMs, or ANOVA decompositions. Uses of the method lead to open problems involving ..."
Cited by 506 (0 self)
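For a concrete (if very small) instance of a kernel on strings in the spirit of this abstract, the k-spectrum kernel below compares two strings through counts of shared length-k substrings; it is one specific member of this family, not the paper's general construction.

    from collections import Counter

    def substring_counts(s, k):
        return Counter(s[i:i + k] for i in range(len(s) - k + 1))

    def spectrum_kernel(s, t, k=3):
        # Inner product of substring-count feature vectors: a valid kernel on strings.
        cs, ct = substring_counts(s, k), substring_counts(t, k)
        return sum(cs[w] * ct[w] for w in cs.keys() & ct.keys())

    print(spectrum_kernel("radial basis", "radial kernel"))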
A tutorial on support vector machines for pattern recognition
Data Mining and Knowledge Discovery, 1998
"... The tutorial starts with an overview of the concepts of VC dimension and structural risk minimization. We then describe linear Support Vector Machines (SVMs) for separable and non-separable data, working through a non-trivial example in detail. We describe a mechanical analogy, and discuss when SV ..."
"... large (even infinite) VC dimension by computing the VC dimension for homogeneous polynomial and Gaussian radial basis function kernels. While very high VC dimension would normally bode ill for generalization performance, and while at present there exists no theory which shows that good generalization ..."
Cited by 3393 (12 self)
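Since the excerpt singles out the Gaussian radial basis function kernel, here is a minimal sketch of that kernel and a quick numerical check that its Gram matrix is positive definite for distinct points (the parameter gamma and the toy data are my own choices):

    import numpy as np

    def gaussian_rbf_kernel(X, Z, gamma=1.0):
        # K[i, j] = exp(-gamma * ||X[i] - Z[j]||^2)
        d2 = ((X[:, None, :] - Z[None, :, :]) ** 2).sum(axis=-1)
        return np.exp(-gamma * d2)

    X = np.random.default_rng(2).normal(size=(5, 3))
    K = gaussian_rbf_kernel(X, X)
    print(np.all(np.linalg.eigvalsh(K) > 0))   # True: positive definite Gram matrix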
MicroRNA genes are transcribed by RNA polymerase II
EMBO J, 2004
"... MicroRNAs (miRNAs) constitute a large family of noncoding RNAs that function as guide molecules in diverse gene silencing pathways. Current efforts are focused on the regulatory function of miRNAs, while little is known about how these unusual genes themselves are regulated. Here we present the fir ..."
Cited by 491 (8 self)