Results 1–10 of 132
Locally weighted learning
 Artificial Intelligence Review
, 1997
Abstract

Cited by 594 (53 self)
This paper surveys locally weighted learning, a form of lazy learning and memory-based learning, and focuses on locally weighted linear regression. The survey discusses distance functions, smoothing parameters, weighting functions, local model structures, regularization of the estimates and bias, assessing predictions, handling noisy data and outliers, improving the quality of predictions by tuning fit parameters, interference between old and new data, implementing locally weighted learning efficiently, and applications of locally weighted learning. A companion paper surveys how locally weighted learning can be used in robot learning and control.
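As a rough illustration of the technique this survey covers, the following is a minimal locally weighted linear regression sketch; the Gaussian weighting function, the bandwidth value, and all names are illustrative choices, not taken from the paper.

```python
import numpy as np

def lwr_predict(x_query, X, y, bandwidth=0.2):
    """Locally weighted linear regression: fit a line by weighted least
    squares, weighting each training point by a Gaussian kernel of its
    distance to the query, then evaluate the fitted line at the query."""
    # Weighting function of distance (one of the choices the survey discusses).
    w = np.exp(-((X - x_query) ** 2) / (2 * bandwidth ** 2))
    # Local linear model: intercept + slope, centered at the query point.
    A = np.column_stack([np.ones_like(X), X - x_query])
    # Weighted normal equations: (A' W A) beta = A' W y.
    beta = np.linalg.solve(A.T @ (A * w[:, None]), A.T @ (w * y))
    return beta[0]  # value of the local line at x_query

# Noisy samples of a nonlinear function; the local fit tracks it pointwise.
rng = np.random.default_rng(0)
X = np.linspace(0.0, np.pi, 300)
y = np.sin(X) + rng.normal(scale=0.05, size=X.shape)
y_hat = lwr_predict(np.pi / 2, X, y)
```

Because the fit is recomputed at every query, this is a lazy method in the paper's sense: no global model is ever stored, only the training data.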
Constructive Incremental Learning from Only Local Information
, 1998
Abstract

Cited by 206 (39 self)
... This article illustrates the potential learning capabilities of purely local learning and offers an interesting and powerful approach to learning with receptive fields.
Data-driven Bandwidth Selection in Local Polynomial Fitting: Variable Bandwidth and Spatial Adaptation
 Journal of the Royal Statistical Society. Series B
, 1995
Efficient Estimation of Conditional Variance Functions in Stochastic Regression
 Biometrika
, 1998
Abstract

Cited by 120 (7 self)
this paper is to derive an efficient fully-adaptive procedure for estimating ...
Multivariate Local Polynomial Regression For Time Series: Uniform Strong Consistency And Rates
 J. Time Ser. Anal
, 1996
Abstract

Cited by 116 (2 self)
Local high-order polynomial fitting is employed for the estimation of the multivariate regression function m(x_1, ..., x_d) = E[ψ(Y_d) | X_1 = x_1, ..., X_d = x_d], and of its partial derivatives, for stationary random processes {Y_i, X_i}. The function ψ may be selected to yield estimates of the conditional mean, conditional moments and conditional distributions. Uniform strong consistency over compact subsets of R^d, along with rates, is established for the regression function and its partial derivatives for strongly mixing processes. Short title: Multivariate Regression Estimation. Key words: multivariate regression estimation; local polynomial fitting; mixing processes; uniform strong consistency; rates of convergence. AMS (1991) subject classification: 62G07, 62H12, 62M09. This work was supported by the Office of Naval Research under Grant N00014-90-J-1175.
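The abstract's setup can be sketched in one dimension: fitting a weighted polynomial in u = X - x0 yields m(x0) as the constant term and the first derivative as the linear term. This is a hedged sketch, not the paper's multivariate procedure; the Gaussian kernel, the degree, and all names are illustrative.

```python
import numpy as np

def local_poly(x0, X, Y, degree=2, bandwidth=0.25):
    """Local polynomial fit at x0: estimates of m(x0) and m'(x0).

    Fits a kernel-weighted polynomial in u = X - x0; the constant term
    estimates the regression function and the linear term its derivative.
    """
    u = X - x0
    w = np.exp(-(u / bandwidth) ** 2 / 2)  # Gaussian kernel weights
    # np.polyfit's `w` multiplies the residuals, so pass the square root
    # of the kernel weights to get kernel-weighted least squares.
    coefs = np.polyfit(u, Y, deg=degree, w=np.sqrt(w))
    return coefs[-1], coefs[-2]  # (m_hat, dm_hat)

# Truth m(x) = x^2, so m(1) = 1 and m'(1) = 2.
rng = np.random.default_rng(1)
X = rng.uniform(-2.0, 2.0, 1000)
Y = X ** 2 + rng.normal(scale=0.05, size=X.shape)
m_hat, dm_hat = local_poly(1.0, X, Y)
```

The same weighted fit applied to a transformed response (e.g. indicators or powers of Y) gives the conditional-moment and conditional-distribution estimates the abstract mentions.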
Functional-coefficient Regression Models for Nonlinear Time Series
 Journal of the American Statistical Association
, 1998
Abstract

Cited by 81 (15 self)
We apply the local linear regression technique to the estimation of functional-coefficient regression models for time series data. The models include threshold autoregressive models (Tong 1990) and functional-coefficient autoregressive models (Chen and Tsay 1993) as special cases, but with added advantages such as depicting finer structure of the underlying dynamics and better post-sample forecasting performance. We also propose a new bootstrap test for the goodness of fit of the models and a bandwidth selector based on a newly defined cross-validatory estimate of the expected forecasting errors. The proposed methodology is data-analytic and flexible enough to analyze complex and multivariate nonlinear structures without suffering from the "curse of dimensionality". The asymptotic properties of the proposed estimators are investigated under the α-mixing condition. Both simulated and real data examples are used for illustration. Key words: α-mixing; asymptotic normali...
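The core idea of coefficients that vary with a state variable can be sketched on synthetic data. This is a toy regression, not the paper's time-series estimator: the local constant weighting below is a simplification of the paper's local linear scheme, and all names are illustrative.

```python
import numpy as np

def varying_coef(u0, U, x, y, bandwidth=0.15):
    """Kernel-weighted least-squares estimate of a(u0) in the
    varying-coefficient model y_i = a(U_i) * x_i + noise.

    Points whose state variable U_i lies near u0 dominate the fit,
    so the recovered coefficient is local in u."""
    w = np.exp(-((U - u0) / bandwidth) ** 2 / 2)
    return np.sum(w * x * y) / np.sum(w * x * x)

# True coefficient function a(u) = 1 + u; estimate it at u0 = 0.5.
rng = np.random.default_rng(2)
U = rng.uniform(-1.0, 1.0, 4000)
x = rng.normal(size=U.shape)
y = (1.0 + U) * x + rng.normal(scale=0.3, size=U.shape)
a_hat = varying_coef(0.5, U, x, y)
```

In the time-series setting of the paper, U and x would be lagged values of the series itself, which is what yields the threshold and functional-coefficient autoregressive models as special cases.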
On Automatic Boundary Corrections
 Annals of Statistics
, 1996
Abstract

Cited by 48 (2 self)
Many popular curve estimators based on smoothing have difficulties caused by boundary effects. These effects are visually disturbing in practice and can play a dominant role in theoretical analysis. Local polynomial regression smoothers are known to correct boundary effects automatically. Some analogs are implemented for density estimation, and the resulting estimators also achieve automatic boundary corrections. In both settings, density and regression estimation, we investigate the best weight functions for local polynomial fitting at the endpoints and find a simple solution. The solution is universal for general degree of local polynomial fitting and general order of the estimated derivative. Furthermore, such local polynomial estimators are best among all linear estimators in a weak minimax sense, and they are highly efficient even in the usual linear minimax sense. This research is part of Ming-Yen Cheng's dissertation under the supervision of Professors J. Fan and J. S. Marron at th...
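The automatic boundary correction the abstract refers to can be seen in a small experiment (a hedged illustration with a Gaussian kernel; names and constants are mine): at an endpoint of the design, a local constant (Nadaraya–Watson) smoother is biased because it averages one-sided data, while a local linear fit removes that bias.

```python
import numpy as np

def nw_estimate(x0, X, y, h):
    """Nadaraya-Watson (local constant) kernel estimate at x0."""
    w = np.exp(-((X - x0) / h) ** 2 / 2)
    return np.sum(w * y) / np.sum(w)

def local_linear_estimate(x0, X, y, h):
    """Local linear estimate at x0 (intercept of a weighted line fit)."""
    w = np.exp(-((X - x0) / h) ** 2 / 2)
    A = np.column_stack([np.ones_like(X), X - x0])
    beta = np.linalg.solve(A.T @ (A * w[:, None]), A.T @ (w * y))
    return beta[0]

# Data on [0, 1] from a sloped function; evaluate at the left endpoint,
# where only points to the right of x0 = 0 are available.
rng = np.random.default_rng(3)
X = rng.uniform(0.0, 1.0, 2000)
y = 2.0 * X + rng.normal(scale=0.05, size=X.shape)
nw = nw_estimate(0.0, X, y, h=0.1)       # biased upward at the boundary
ll = local_linear_estimate(0.0, X, y, h=0.1)  # close to the truth m(0) = 0
```

The local linear estimator corrects the boundary without any special-case logic, which is the "automatic" property the paper extends to density estimation.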
Assessing the quality of learned local models
 Advances in Neural Information Processing Systems 6
, 1994
Abstract

Cited by 47 (16 self)
An approach is presented to learning high-dimensional functions in the case where the learning algorithm can affect the generation of new data. A local modeling algorithm, locally weighted regression, is used to represent the learned function. Architectural parameters of the approach, such as distance metrics, are also localized and become a function of the query point instead of being global. Statistical tests are given for when a local model is good enough and sampling should be moved to a new area. Our methods explicitly deal with the case where prediction accuracy requirements exist during exploration: by gradually shifting a "center of exploration" and controlling the speed of the shift with local prediction accuracy, a goal-directed exploration of state space takes place along the fringes of the current data support until the task goal is achieved. We illustrate this approach with simulation results and results from a real robot learning a complex juggling task.
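A crude version of the "is this local model good enough?" question can be sketched with a weighted leave-one-out error of a locally weighted linear fit. This is my own simplification, not the statistical tests of the paper; the Gaussian kernel and all names are illustrative.

```python
import numpy as np

def local_loo_error(xq, X, y, bandwidth=0.2):
    """Weighted leave-one-out error of a locally weighted linear fit at xq.

    Uses the hat matrix H of the weighted least-squares fit, so each
    leave-one-out residual is r_i / (1 - H_ii). A large value suggests
    the local model does not yet describe the data well near xq."""
    w = np.exp(-((X - xq) / bandwidth) ** 2 / 2)
    A = np.column_stack([np.ones_like(X), X - xq])
    W = np.diag(w)
    # Hat matrix: y_hat = H y for the weighted least-squares smoother.
    H = A @ np.linalg.solve(A.T @ W @ A, A.T @ W)
    resid = y - H @ y
    loo = resid / (1.0 - np.diag(H))
    return np.sum(w * loo ** 2) / np.sum(w)

# A nearly linear target is "good enough" locally; a fast-oscillating
# one is not, and the error statistic reflects that.
rng = np.random.default_rng(4)
X = np.sort(rng.uniform(0.0, 1.0, 300))
y_easy = 2.0 * X + rng.normal(scale=0.1, size=X.shape)
y_hard = np.sin(25.0 * X) + rng.normal(scale=0.1, size=X.shape)
e_easy = local_loo_error(0.5, X, y_easy)
e_hard = local_loo_error(0.5, X, y_hard)
```

Thresholding such a statistic is one plausible way to decide, as the abstract describes, when sampling should be moved to a new area.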
Local Polynomial Estimation of Regression Functions for Mixing Processes
Abstract

Cited by 43 (9 self)
Local polynomial fitting has many exciting statistical properties that were established under the i.i.d. setting. However, the needs of nonlinear time series modeling, of constructing predictive intervals, and of understanding the divergence of nonlinear time series require the development of a theory of local polynomial fitting for dependent data. In this paper, we study the problem of estimating conditional mean functions and their derivatives via a local polynomial fit. The functions include conditional moments, conditional distributions as well as conditional density functions. Joint asymptotic normality for derivative estimation is established for both strongly mixing and ρ-mixing processes.
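One of the targets the abstract mentions, the conditional distribution, reduces in its simplest local constant form to smoothing indicator variables in X. This is a sketch under that simplification, not the paper's local polynomial estimator; all names are illustrative.

```python
import numpy as np

def cond_cdf(y0, x0, X, Y, bandwidth=0.2):
    """Kernel-weighted estimate of F(y0 | x0) = P(Y <= y0 | X = x0):
    a locally weighted average of the indicators 1{Y_i <= y0}."""
    w = np.exp(-((X - x0) / bandwidth) ** 2 / 2)
    return np.sum(w * (Y <= y0)) / np.sum(w)

# Y = X + standard normal noise, so F(x0 | x0) = 0.5 exactly.
rng = np.random.default_rng(5)
X = rng.uniform(-1.0, 1.0, 5000)
Y = X + rng.normal(size=X.shape)
p = cond_cdf(0.0, 0.0, X, Y)
```

Replacing the locally weighted average by a local polynomial fit of the same indicators gives the estimator family the paper analyzes for dependent data.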