"Design-adaptive Nonparametric Regression" (1992)
Venue: Journal of the American Statistical Association
Citations: 435 - 28 self
Citations
1250 |
Robust locally weighted regression and smoothing scatterplots
- Cleveland
- 1979
Citation Context: …In a small neighborhood of a point x, m(y) ≈ m(x) + m′(x)(y − x) = a + b(y − x). Thus the problem of estimating m(x) is equivalent to a local linear regression problem: estimating the intercept a. Now consider a weighted (local) linear regression: finding a and b to minimize the kernel-weighted sum of squares (2.1). Let â and b̂ be the solution to the weighted least squares problem (2.1). Simple calculation yields the solution, where w_j is defined by (2.3). Thus we define the local linear regression smoother by (2.2). This idea is an extension of Stone (1977), who used the kernel function K(x) = ½·1{|x| ≤ 1}, and was studied by Cleveland (1979), Fan (in press), Lejeune (1985), Müller (1987), and Tsybakov (1986). Note that m̂(x) is a weighted average of the responses and is called a linear smoother in the literature. The intuition at the beginning of this section suggests that b̂ estimates m′(x). Discussion of the behavior of b̂ is beyond the scope of this article. The bandwidth h_n can be chosen either subjectively by data analysts or objectively by the data. A frequently used bandwidth-selection technique is the cross-validation method (Stone 1977), which chooses h_n to minimize the cross-validation criterion, where m̂₋ⱼ(·) is the regression estimator (2.2) computed without the jth observation (Xⱼ, Yⱼ)… |
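The excerpt above describes the local linear smoother (fit intercept and slope by kernel-weighted least squares) and leave-one-out cross-validation for the bandwidth. The following is a minimal NumPy sketch of that idea, not the paper's code: the Epanechnikov kernel, the function names, and the simulated data are all illustrative assumptions.

```python
import numpy as np

def epanechnikov(u):
    """Epanechnikov kernel K(u) = 0.75*(1 - u^2) on [-1, 1], else 0."""
    return np.where(np.abs(u) <= 1.0, 0.75 * (1.0 - u**2), 0.0)

def local_linear(x0, X, Y, h):
    """Local linear smoother: choose (a, b) minimizing
    sum_j [Y_j - a - b*(X_j - x0)]^2 * K((X_j - x0)/h).
    Returns (a_hat, b_hat): a_hat estimates m(x0), b_hat estimates m'(x0)."""
    w = epanechnikov((X - x0) / h)
    Z = np.column_stack([np.ones_like(X), X - x0])  # design matrix [1, X - x0]
    A = Z.T @ (w[:, None] * Z)                      # 2x2 weighted normal equations
    c = Z.T @ (w * Y)
    a_hat, b_hat = np.linalg.solve(A, c)
    return float(a_hat), float(b_hat)

def cv_bandwidth_score(h, X, Y):
    """Leave-one-out cross-validation criterion: mean squared error of
    predicting each Y_j from the fit that omits (X_j, Y_j)."""
    n = len(X)
    errs = []
    for j in range(n):
        mask = np.arange(n) != j
        a_hat, _ = local_linear(X[j], X[mask], Y[mask], h)
        errs.append((Y[j] - a_hat) ** 2)
    return float(np.mean(errs))

# Illustration on simulated data: m(x) = x^2, so m(0.5) = 0.25 and m'(0.5) = 1.
rng = np.random.default_rng(0)
X = rng.uniform(0.0, 1.0, 200)
Y = X**2 + 0.05 * rng.standard_normal(200)
a_hat, b_hat = local_linear(0.5, X, Y, h=0.2)
```

Note that the intercept estimate recovers m(x0) while the slope estimate tracks m′(x0), exactly as the excerpt states; minimizing `cv_bandwidth_score` over a grid of h values is one concrete way to choose the bandwidth "objectively by the data."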
339 |
Spline Smoothing and Nonparametric Regression.
- EUBANK
- 1988
Citation Context: …applies to all regression settings and does not require any modification, even at the boundary. Besides, this method has higher efficiency than other traditional nonparametric regression methods. KEY WORDS: Boundary effects; Kernel estimator; Linear smoother; Local linear regression; Minimax efficiency. 1. INTRODUCTION. Consider bivariate data that can be thought of as a random sample from a certain population. It is common practice to study the association between covariates and responses via regression analysis. Nonparametric regression provides a useful explanatory and diagnostic tool for this purpose. See Eubank (1988), Härdle (1990), and Müller (1988) for many examples of this and good introductions to the general subject area. Let (X₁, Y₁), …, (Xₙ, Yₙ) be a random sample from a population having a density f(x, y). Let f_X(x) be the marginal density of X. Denote the regression function by m(x) = E(Y | X = x) and the conditional variance function by σ²(x) = var(Y | X = x). Several methods have been proposed for estimating m(x): kernel, spline, and orthogonal series methods. Among these are two popular kernel methods proposed by Gasser and Müller (1979), Nadaraya (1964), and Watson (196… |
326 | Smooth regression analysis - Watson - 1964 |
279 |
Consistent Nonparametric Regression,”
- Stone
- 1977
90 |
Kernel estimation of regression functions
- Gasser, Mueller
- 1979
Citation Context: …explanatory and diagnostic tool for this purpose. See Eubank (1988), Härdle (1990), and Müller (1988) for many examples of this and good introductions to the general subject area. Let (X₁, Y₁), …, (Xₙ, Yₙ) be a random sample from a population having a density f(x, y). Let f_X(x) be the marginal density of X. Denote the regression function by m(x) = E(Y | X = x) and the conditional variance function by σ²(x) = var(Y | X = x). Several methods have been proposed for estimating m(x): kernel, spline, and orthogonal series methods. Among these are two popular kernel methods proposed by Gasser and Müller (1979), Nadaraya (1964), and Watson (1964). With K being a kernel and h_n being a bandwidth, Table 1 summarizes the asymptotic behavior of the Nadaraya-Watson estimator (3.4), the Gasser-Müller estimator (3.5), and the local linear (regression) smoother (2.2) to be introduced in Section 2. The bias of the Nadaraya-Watson estimator depends on the intrinsic part m″ interplaying with the artifact m′ f′_X / f_X due to the local constant fit. Keeping m″(x) fixed, we first remark that in the highly clustered design where |f′_X(x) / f_X(x)| is large, the bias of the Nadaraya-Watson estimator is large… |
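The excerpt above attributes the Nadaraya-Watson estimator's design-dependent bias to its local *constant* fit. A small sketch can make that concrete; this is an illustrative construction (Epanechnikov kernel, a clustered design X = U² so that f_X is large near 0, noiseless linear m), not an example from the paper.

```python
import numpy as np

def epanechnikov(u):
    """Epanechnikov kernel K(u) = 0.75*(1 - u^2) on [-1, 1], else 0."""
    return np.where(np.abs(u) <= 1.0, 0.75 * (1.0 - u**2), 0.0)

def nadaraya_watson(x0, X, Y, h):
    """Nadaraya-Watson estimator (3.4), a local constant fit:
    m_hat(x0) = sum_j K((X_j - x0)/h) Y_j / sum_j K((X_j - x0)/h)."""
    w = epanechnikov((X - x0) / h)
    return float(np.sum(w * Y) / np.sum(w))

# Clustered design: f_X(x) = 1/(2*sqrt(x)) is large near 0, so f'_X/f_X is
# sizable and negative. With the linear m(x) = x (m'' = 0), any error at
# x0 = 0.25 comes purely from the m' f'_X / f_X artifact of the constant fit.
rng = np.random.default_rng(1)
U = rng.uniform(0.0, 1.0, 4000)
X = U**2
Y = X.copy()                  # noiseless m(x) = x, so m(0.25) = 0.25
nw = nadaraya_watson(0.25, X, Y, h=0.2)
```

Because the design density decreases across the window, the local constant fit is pulled toward the heavier side and lands below the true value 0.25, whereas a local linear fit would reproduce a linear m exactly; this is the design bias the excerpt describes.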
57 | On Optimal Data-Based Bandwidth Selection in Kernel Density Estimation," - Hall, Sheather, et al. - 1991 |
43 |
Robust Reconstruction of Functions by the Local Approximation Method," Problems of Information
- Tsybakov
- 1986
28 | Choosing a Kernel Regression Estimator," - Chu, Marron - 1992 |
20 | Expert face processing: Specialization and constraints - Schwaninger, Carbon, et al. - 2003 |
19 |
A Unifying Approach to Nonparametric Regression Estimation,"
- Jennen-Steinmetz, Gasser
- 1988
Citation Context: …maximum risk over C₂, say, is infinity and its asymptotic minimax efficiency is 0. But in the case of uniform designs, the Nadaraya-Watson estimator has the same asymptotic properties as the local linear regression smoother. Remark 3. Gasser and Müller (1979) defined the following estimator: m̂_GM(x) = Σⱼ₌₁ⁿ Y₍ⱼ₎ ∫_{tⱼ₋₁}^{tⱼ} hₙ⁻¹ K((x − t)/hₙ) dt, (3.5) where {(X₍ⱼ₎, Y₍ⱼ₎)} are the samples ordered according to the Xⱼ's, t₀ = −∞, tₙ = +∞, and tⱼ = (X₍ⱼ₎ + X₍ⱼ₊₁₎)/2. The variance of the local linear smoother is only two-thirds of that of the corresponding Gasser-Müller estimator, while the bias is the same. (See Chu and Marron 1990; Jennen-Steinmetz and Gasser 1988; and Mack and Müller 1989 for the expression of the variance of the Gasser-Müller estimator.) Thus, in the case of random designs the latter estimator effectively uses only two-thirds of the available data and is not admissible. For fixed designs, however, these two smoothers have the same asymptotic performance. 4. BEST LINEAR SMOOTHERS. Definition 1. A linear smoother is defined by the following weighted average: m̂_L(x) = Σⱼ₌₁ⁿ Wⱼ(X₁, …, Xₙ) Yⱼ. (4.1) It is obvious that estimator (2.2) is a linear smoother, as are the Nadaraya-Watson and Gasser-Müller estimators. Estimation of regression using t… |
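The Gasser-Müller construction in (3.5) weights each (ordered) response by the integral of the rescaled kernel over the interval between design midpoints. A minimal sketch, assuming an Epanechnikov kernel so the integrals have a closed form; the helper names are illustrative, not from the paper.

```python
import numpy as np

def epan_cdf(u):
    """CDF of the Epanechnikov kernel: integral of 0.75*(1 - t^2) over [-1, u]."""
    u = np.clip(u, -1.0, 1.0)
    return 0.5 + 0.75 * u - 0.25 * u**3

def gasser_muller(x0, X, Y, h):
    """Gasser-Muller estimator (3.5): sort the sample by X, split the line at
    midpoints t_j = (X_(j) + X_(j+1))/2 with t_0 = -inf and t_n = +inf, and
    weight each Y_(j) by the integral of (1/h) K((x0 - t)/h) over [t_{j-1}, t_j]."""
    order = np.argsort(X)
    Xs, Ys = X[order], Y[order]
    n = len(Xs)
    t = np.empty(n + 1)
    t[0], t[n] = -np.inf, np.inf
    t[1:n] = 0.5 * (Xs[:-1] + Xs[1:])
    # Closed-form interval integrals; substituting u = (x0 - t)/h gives
    # F((x0 - t_{j-1})/h) - F((x0 - t_j)/h) with F the kernel CDF.
    w = epan_cdf((x0 - t[:-1]) / h) - epan_cdf((x0 - t[1:]) / h)
    return float(np.sum(w * Ys))

# Illustration on simulated data: m(x) = sin(2x), estimated at x0 = 0.5.
rng = np.random.default_rng(2)
X = rng.uniform(0.0, 1.0, 300)
Y = np.sin(2 * X) + 0.05 * rng.standard_normal(300)
gm = gasser_muller(0.5, X, Y, h=0.15)
```

The interval weights telescope to F(+∞) − F(−∞) = 1, so the estimator is a weighted average of the responses: a linear smoother exactly in the sense of Definition 1, equation (4.1).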
6 |
Weighted Local Regression and Kernel Methods for Nonparametric Curve Fitting,"
- Müller
- 1987
1 | Estimation Non-paramétrique par Noyaux: Régression Polynomiale Mobile - Lejeune - 1985 |
1 | Convolution-Type Estimators for Nonparametric Regression Estimation," - Mack, Müller - 1989 |
1 |
On Estimating Regression," Theory of Probability and Its
- Nadaraya
- 1964