Results 1 – 10 of 1,518,716
Least-Squares Policy Iteration
JOURNAL OF MACHINE LEARNING RESEARCH, 2003
"... We propose a new approach to reinforcement learning for control problems which combines value-function approximation with linear architectures and approximate policy iteration. This new approach ..."
Cited by 461 (12 self)
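The abstract pairs linear value-function approximation with approximate policy iteration. A minimal Python sketch of that idea (LSTD-Q fitting a linear Q-function, then greedy policy improvement); the batch of transitions, the feature map phi, and the action set are placeholders, and this is an illustrative reading rather than the paper's exact algorithm:

    import numpy as np

    def lstdq(samples, phi, policy, k, gamma=0.95):
        # Fit linear Q(s, a) = phi(s, a) . w for the current policy by
        # solving the least-squares temporal-difference system A w = b.
        A, b = np.zeros((k, k)), np.zeros(k)
        for s, a, r, s_next in samples:   # samples: (state, action, reward, next state)
            f = phi(s, a)
            f_next = phi(s_next, policy(s_next))
            A += np.outer(f, f - gamma * f_next)
            b += r * f
        return np.linalg.solve(A + 1e-8 * np.eye(k), b)   # tiny ridge for stability

    def lspi(samples, phi, actions, k, gamma=0.95, n_iter=20):
        # Alternate policy evaluation (lstdq) and greedy policy improvement.
        w = np.zeros(k)
        for _ in range(n_iter):
            greedy = lambda s, w=w: max(actions, key=lambda a: phi(s, a) @ w)
            w_new = lstdq(samples, phi, greedy, k, gamma)
            if np.linalg.norm(w_new - w) < 1e-6:
                break
            w = w_new
        return w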
Least Median of Squares Regression
JOURNAL OF THE AMERICAN STATISTICAL ASSOCIATION, 1984
A scheme for robust distributed sensor fusion based on average consensus
PROCEEDINGS OF THE INTERNATIONAL CONFERENCE ON INFORMATION PROCESSING IN SENSOR NETWORKS (IPSN), 2005
"... We consider a network of distributed sensors, where each sensor takes a linear measurement of some unknown parameters, corrupted by independent Gaussian noises. We propose a simple distributed iterative scheme, based on distributed average consensus in the network, to compute the maximum-likelihood ..."
Cited by 250 (3 self)
"... compute a local weighted least-squares estimate, which converges to the global maximum-likelihood solution. This scheme is robust to unreliable communication links. We show that it works in a network with dynamically changing topology, provided that the infinitely occurring communication graphs ..."
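The excerpt describes distributed averaging: each node repeatedly replaces its value with a weighted average of its own and its neighbors' values, and every node converges to the network-wide mean without any fusion center. A minimal sketch with Metropolis weights (the 4-node graph and iteration count are made-up illustrations, not the paper's setup):

    import numpy as np

    def consensus_step(x, nbrs):
        # One synchronous round: x_i <- x_i + sum_j w_ij (x_j - x_i),
        # with Metropolis weights w_ij = 1 / (1 + max(deg_i, deg_j)).
        x_new = x.copy()
        for i in nbrs:
            for j in nbrs[i]:
                w = 1.0 / (1 + max(len(nbrs[i]), len(nbrs[j])))
                x_new[i] += w * (x[j] - x[i])
        return x_new

    nbrs = {0: [1], 1: [0, 2], 2: [1, 3], 3: [2]}   # path graph
    x = np.array([1.0, 2.0, 3.0, 10.0])
    for _ in range(200):
        x = consensus_step(x, nbrs)
    print(x)   # every entry approaches the mean, 4.0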
Locally weighted learning
ARTIFICIAL INTELLIGENCE REVIEW, 1997
"... This paper surveys locally weighted learning, a form of lazy learning and memory-based learning, and focuses on locally weighted linear regression. The survey discusses distance functions, smoothing parameters, weighting functions, local model structures, regularization of the estimates and bias, ass ..."
Cited by 594 (53 self)
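Locally weighted linear regression, the survey's focus, fits a separate weighted least-squares model at each query point, with weights supplied by a distance kernel. A minimal sketch using a Gaussian kernel (the bandwidth and ridge term are placeholder choices):

    import numpy as np

    def lwr_predict(x_query, X, y, bandwidth=0.5):
        # Weight training rows by distance to the query, then solve the
        # weighted normal equations for a local linear model with intercept.
        d = np.linalg.norm(X - x_query, axis=1)
        w = np.exp(-0.5 * (d / bandwidth) ** 2)
        Xa = np.hstack([X, np.ones((len(X), 1))])
        A = Xa.T @ (w[:, None] * Xa) + 1e-8 * np.eye(Xa.shape[1])
        beta = np.linalg.solve(A, Xa.T @ (w * y))
        return np.append(x_query, 1.0) @ beta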
A space-time diffusion scheme for peer-to-peer least-squares estimation
Proc. of IPSN, 2006
"... We consider a sensor network in which each sensor takes measurements, at various times, of some unknown parameters, corrupted by independent Gaussian noises. Each node can take a finite or infinite number of measurements, at arbitrary times (i.e., asynchronously). We propose a space-time diffusion s ..."
Cited by 29 (0 self)
"... each node's state by a weighted average of its current value and locally available data: new measurements for the time update, and neighbors' data for the spatial update. At any time, any node can compute a local weighted least-squares estimate of the unknown parameters, which converges to the global ..."
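The excerpt alternates a time update (fold new local measurements into the node's state) with a spatial update (average state with neighbors). A minimal scalar-parameter sketch in that spirit, where each node's state is a pair of running normal-equation statistics and the local estimate is their ratio; the update weights and structure are illustrative assumptions, not the paper's exact scheme:

    import numpy as np

    def time_update(state, a, y):
        # Fold a new local measurement y = a*theta + noise into (P, q).
        P, q = state
        return P + a * a, q + a * y

    def spatial_update(states, nbrs, eps=0.2):
        # Consensus step on both statistics (eps below 1/max-degree).
        return {i: (P + eps * sum(states[j][0] - P for j in nbrs[i]),
                    q + eps * sum(states[j][1] - q for j in nbrs[i]))
                for i, (P, q) in states.items()}

    def local_estimate(state):
        P, q = state
        return q / P    # node-local weighted least-squares estimate of theta

With enough alternating rounds, every node's q/P approaches the global least-squares estimate.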
Benchmarking Least Squares Support Vector Machine Classifiers
NEURAL PROCESSING LETTERS, 2001
"... In Support Vector Machines (SVMs), the solution of the classification problem is characterized by a (convex) quadratic programming (QP) problem. In a modified version of SVMs, called Least Squares SVM classifiers (LS-SVMs), a least squares cost function is proposed so as to obtain a linear set of eq ..."
Cited by 446 (46 self)
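Because LS-SVMs use a least-squares cost with equality constraints, training reduces to one linear system rather than a QP. A minimal sketch of the standard LS-SVM classifier system with an RBF kernel (gamma and the kernel width are placeholder values):

    import numpy as np

    def rbf(X1, X2, sigma=1.0):
        d2 = ((X1[:, None, :] - X2[None, :, :]) ** 2).sum(-1)
        return np.exp(-d2 / (2 * sigma ** 2))

    def lssvm_train(X, y, gamma=10.0, sigma=1.0):
        # Solve the LS-SVM KKT system (y in {-1, +1}):
        #   [ 0        y^T         ] [ b     ]   [ 0 ]
        #   [ y   Omega + I/gamma  ] [ alpha ] = [ 1 ]
        # with Omega_kl = y_k * y_l * K(x_k, x_l).
        n = len(y)
        A = np.zeros((n + 1, n + 1))
        A[0, 1:], A[1:, 0] = y, y
        A[1:, 1:] = np.outer(y, y) * rbf(X, X, sigma) + np.eye(n) / gamma
        sol = np.linalg.solve(A, np.concatenate(([0.0], np.ones(n))))
        return sol[1:], sol[0]                       # alpha, b

    def lssvm_predict(X_new, X, y, alpha, b, sigma=1.0):
        return np.sign(rbf(X_new, X, sigma) @ (alpha * y) + b)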
The approximation power of moving least-squares
Math. Comp., 1998
"... A general method for near-best approximations to functionals on R^d, using scattered-data information is discussed. The method is actually the moving least-squares method, presented by the Backus-Gilbert approach. It is shown that the method works very well for interpolation, smoothing and ..."
Cited by 159 (7 self)
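Moving least-squares evaluates, at each query point, a low-degree polynomial fitted by least squares whose weights decay with distance from that point. A minimal 1-D sketch with a linear basis and Gaussian weights (the bandwidth is a placeholder; the paper's Backus-Gilbert presentation is not reproduced here):

    import numpy as np

    def mls_eval(x, xi, fi, h=0.3):
        # Fit p(t) = c0 + c1*t by least squares with weights exp(-((t-x)/h)^2),
        # then evaluate p at the query point x.
        w = np.exp(-((xi - x) / h) ** 2)
        B = np.stack([np.ones_like(xi), xi], axis=1)
        c = np.linalg.solve(B.T @ (w[:, None] * B) + 1e-12 * np.eye(2),
                            B.T @ (w * fi))
        return c[0] + c[1] * x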
Direct Least Square Fitting of Ellipses
1998
"... This work presents a new efficient method for fitting ellipses to scattered data. Previous algorithms either fitted general conics or were computationally expensive. By minimizing the algebraic distance subject to the constraint 4ac − b² = 1, the new method incorporates the ellipticity constraint into the normalization factor. The proposed method combines several advantages: (i) it is ellipse-specific, so that even bad data will always return an ellipse; (ii) it can be solved naturally by a generalized eigensystem; and (iii) it is extremely robust, efficient and easy to implement."
Cited by 421 (3 self)
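The abstract's generalized eigensystem can be written down directly: minimize ||D a||^2 over conic coefficients a = (a, b, c, d, e, f) subject to a^T C a = 4ac - b^2 = 1, which leads to S a = lambda C a with scatter matrix S = D^T D. A minimal sketch of that construction (a reading of the abstract, not a drop-in replacement for the paper's implementation):

    import numpy as np

    def fit_ellipse(x, y):
        # Design matrix of conic monomials for a*x^2 + b*xy + c*y^2 + d*x + e*y + f.
        D = np.stack([x * x, x * y, y * y, x, y, np.ones_like(x)], axis=1)
        S = D.T @ D
        C = np.zeros((6, 6))            # constraint matrix: a^T C a = 4ac - b^2
        C[0, 2] = C[2, 0] = 2.0
        C[1, 1] = -1.0
        # The eigenvector of inv(S) C with the (unique) positive eigenvalue
        # solves S a = lambda C a with a^T C a > 0, i.e. an ellipse.
        evals, evecs = np.linalg.eig(np.linalg.solve(S, C))
        a = evecs[:, np.argmax(evals.real)].real
        return a / np.sqrt(a @ C @ a)   # rescale so that 4ac - b^2 = 1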
NORMALISED LEAST-SQUARES ESTIMATION IN . . .
2006
"... We investigate the time-varying ARCH (tvARCH) process. It is shown that it can be used to describe the slow decay of the sample autocorrelations of the squared returns often observed in financial time series, which warrants the further study of parameter estimation methods for the model. Since the p ..."
"... the parameters are changing over time, a successful estimator needs to perform well for small samples. We propose a kernel normalised-least-squares (kernel-NLS) estimator which has a closed form, and thus outperforms the previously proposed kernel quasi-maximum likelihood (kernel-QML) estimator for small ..."
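The closed form comes from localizing a least-squares fit in time: regress squared returns on their own lags with kernel weights centered at the time point of interest. A rough sketch of such a kernel-weighted fit (plain weighted least squares; the paper's specific normalisation is not reproduced here):

    import numpy as np

    def kernel_ls_tvarch(x, t0, p=1, h=100.0):
        # Local fit of x_t^2 ~ a0 + a1*x_{t-1}^2 + ... + ap*x_{t-p}^2
        # with Gaussian kernel weights in |t - t0| (bandwidth h).
        x2 = x ** 2
        t = np.arange(p, len(x))
        Z = np.stack([np.ones(len(t))] + [x2[t - j] for j in range(1, p + 1)], axis=1)
        w = np.exp(-0.5 * ((t - t0) / h) ** 2)
        A = Z.T @ (w[:, None] * Z)
        return np.linalg.solve(A, Z.T @ (w * x2[t]))   # (a0(t0), ..., ap(t0))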
A PERFORMANCE EVALUATION OF LOCAL DESCRIPTORS
2005
"... In this paper we compare the performance of descriptors computed for local interest regions, as for example extracted by the Harris-Affine detector [32]. Many different descriptors have been proposed in the literature. However, it is unclear which descriptors are more appropriate and how their perfo ..."
Cited by 1752 (53 self)
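A standard way to compare such descriptors is a recall versus 1-precision curve from threshold-based matching against ground-truth correspondences. A minimal sketch of that evaluation (the matching rule and inputs are simplified assumptions):

    import numpy as np

    def recall_vs_one_minus_precision(da, db, gt_pairs, thresholds):
        # da, db: descriptor arrays for two images; gt_pairs: ground-truth
        # correspondences (i, j). A pair matches when its distance < t.
        dist = np.linalg.norm(da[:, None, :] - db[None, :, :], axis=-1)
        gt = set(map(tuple, gt_pairs))
        curve = []
        for t in thresholds:
            ii, jj = np.nonzero(dist < t)
            matches = list(zip(ii.tolist(), jj.tolist()))
            correct = sum(m in gt for m in matches)
            recall = correct / len(gt)
            curve.append((recall, (len(matches) - correct) / max(len(matches), 1)))
        return curve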