Results 1–10 of 12
Local linear quantile regression with dependent censored data
Statist. Sinica, 2009
Abstract

Cited by 4 (1 self)
Abstract: We consider the problem of nonparametrically estimating the conditional quantile function from censored dependent data. The method proposed here is based on a local linear fit using the check function approach. The asymptotic properties of the proposed estimator are established. Since the estimator is defined as a solution of a minimization problem, we also propose a numerical algorithm. We investigate the performance of the estimator for small samples through a simulation study, and we also discuss the optimal choice of the bandwidth parameters. Key words and phrases: Censoring, kernel smoothing, local linear smoothing, mixing sequences, nonparametric regression, quantile regression, strong mixing, survival analysis.
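As a rough illustration of the check-function approach this abstract refers to, here is a minimal sketch in Python. It fits a kernel-weighted quantile by minimizing the check (pinball) loss over a grid; note this is a local-constant simplification without the paper's censoring adjustment or local linear term, and all function names and parameter choices are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def check_loss(u, tau):
    """Koenker-Bassett check (pinball) function rho_tau(u) = u * (tau - 1{u < 0})."""
    return u * (tau - (u < 0))

def kernel_quantile(x0, x, y, tau, h, grid_size=401):
    """Kernel-weighted tau-quantile at x0: minimize sum_i K((x_i - x0)/h) * rho_tau(y_i - a)
    over a by grid search (illustrative; the paper fits a local *linear* model and
    accounts for censoring)."""
    w = np.exp(-0.5 * ((x - x0) / h) ** 2)  # Gaussian kernel weights
    grid = np.linspace(y.min(), y.max(), grid_size)
    obj = [np.sum(w * check_loss(y - a, tau)) for a in grid]
    return grid[int(np.argmin(obj))]
```

Because the estimator is defined through a minimization, a grid or iterative solver is needed in practice, which matches the abstract's remark about a numerical algorithm.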
Local Rank Inference for Varying Coefficient Models, technical report, The Methodology, 2009
Abstract

Cited by 4 (0 self)
By allowing the regression coefficients to change with certain covariates, the class of varying coefficient models offers a flexible approach to modeling nonlinearity and interactions between covariates. This article proposes a novel estimation procedure for varying coefficient models based on local ranks. The new procedure provides a highly efficient and robust alternative to the local linear least squares method, and can be conveniently implemented using existing R software packages. Theoretical analysis and numerical simulations both reveal that the gain of the local rank estimator over the local linear least squares estimator, measured by the asymptotic mean squared error or the asymptotic mean integrated squared error, can be substantial. In the normal error case, the asymptotic relative efficiency for estimating both the coefficient functions and their derivatives is above 96%; even in the worst-case scenarios, the asymptotic relative efficiency has a lower bound of 88.96% for estimating the coefficient functions, and a lower bound of 89.91% for estimating their derivatives. The new estimator may achieve the nonparametric convergence rate even when the local linear least squares method fails due to infinite random error variance. We establish the large-sample theory of the proposed procedure by utilizing results from generalized U-statistics whose kernel function may depend on the sample size. We also extend a resampling approach, which perturbs the objective function repeatedly, to the generalized U-statistics setting, and demonstrate that it can accurately estimate the asymptotic covariance matrix.
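To give a flavor of rank-based (Wilcoxon-type) regression in its simplest one-dimensional form, the sketch below minimizes the pairwise-difference dispersion Σ_{i<j} |(y_i − y_j) − b(x_i − x_j)| over the slope b; its minimizer is a weighted median of pairwise slopes. This is only the most elementary instance of the idea (a Theil–Sen-like global slope), not the paper's local rank estimator for varying coefficients; all names here are illustrative.

```python
import numpy as np

def rank_slope(x, y):
    """Minimize the Wilcoxon-type dispersion sum_{i<j} |(y_i - y_j) - b*(x_i - x_j)|
    over b. Rewriting each term as |dx| * |dy/dx - b| shows the minimizer is a
    weighted median of the pairwise slopes dy/dx with weights |dx|."""
    i, j = np.triu_indices(len(x), k=1)
    dx, dy = x[i] - x[j], y[i] - y[j]
    keep = dx != 0
    slopes = dy[keep] / dx[keep]
    w = np.abs(dx[keep])
    order = np.argsort(slopes)
    cw = np.cumsum(w[order])
    return slopes[order][np.searchsorted(cw, 0.5 * cw[-1])]
```

The robustness claim in the abstract shows up even here: a single gross outlier in y moves the least squares slope but leaves the weighted median of pairwise slopes essentially unchanged.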
Knot selection by boosting techniques
Computational Statistics & Data Analysis, 2007
Abstract

Cited by 3 (0 self)
A novel concept for estimating smooth functions by selection techniques based on boosting is developed. It is suggested to put radial basis functions with different spreads at each knot and to do selection and estimation simultaneously by a componentwise boosting algorithm. The methodology of various other smoothing and knot selection procedures (e.g. stepwise selection) is summarized, and they are compared to the proposed approach by extensive simulations for various unidimensional settings, including varying spatial variation and heteroskedasticity, as well as on a real-world data example. Finally, an extension of the proposed method to surface fitting is evaluated numerically on both simulated and real data. The proposed knot selection technique is shown to be a strong competitor to existing methods for knot selection.
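The core mechanism the abstract describes, componentwise L2-boosting over radial basis functions with several spreads per knot, can be sketched as follows. At each step the single basis that most reduces the residual sum of squares is selected and its coefficient updated with shrinkage. This is a generic sketch under assumed defaults (Gaussian RBFs, shrinkage 0.1), not the paper's algorithm or tuning.

```python
import numpy as np

def rbf(x, knot, s):
    """Gaussian radial basis function centered at a knot with spread s."""
    return np.exp(-((x - knot) ** 2) / (2 * s ** 2))

def boost_rbf(x, y, knots, spreads, steps=200, nu=0.1):
    """Componentwise L2-boosting: at each step, pick the RBF basis with the
    largest residual-sum-of-squares reduction and add a shrunken least-squares
    coefficient for it. Selection and estimation happen simultaneously."""
    bases = [rbf(x, k, s) for k in knots for s in spreads]
    fit = np.zeros_like(y, dtype=float)
    coefs = np.zeros(len(bases))
    for _ in range(steps):
        r = y - fit
        num = np.array([b @ r for b in bases])   # <b, residual>
        den = np.array([b @ b for b in bases])   # <b, b>
        gain = num ** 2 / den                    # SSE reduction per basis
        j = int(np.argmax(gain))
        c = nu * num[j] / den[j]                 # shrunken update
        coefs[j] += c
        fit += c * bases[j]
    return fit, coefs
```

Knots whose coefficients stay at zero after boosting are effectively deselected, which is how the selection and estimation steps merge into one procedure.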
SOME THOUGHTS ABOUT THE DESIGN OF LOSS FUNCTIONS
Abstract

Cited by 2 (0 self)
The choice and design of loss functions is discussed. Particularly when computational methods like cross-validation are applied, there is no need to stick to “standard” loss functions such as the L2-loss (squared loss). Our main message is that the choice of a loss function in a practical situation is the translation of an informal aim or interest that a researcher may have into the formal language of mathematics. The choice of a loss function cannot be formalized as the solution of a mathematical decision problem in itself. An illustrative case study about the location of branches of a chain of restaurants is given. Statistical aspects of loss functions are treated, such as the distinction between applications of loss functions to prediction and estimation problems, and the direct definition of estimators to minimize loss functions. The impact of subjective decisions on the design of loss functions is also emphasized and discussed. Key words: prediction; estimation; decision theory; M-estimator; MM-estimator; linear regression. AMS Subject Classification:
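The abstract's point, that the loss function encodes the researcher's informal aim, can be made concrete: on the same skewed data, the squared loss delivers the mean, the absolute loss the median, and the pinball loss a chosen quantile. The sketch below, with hypothetical names and a brute-force grid search, just illustrates this correspondence.

```python
import numpy as np

def best_constant(y, loss):
    """The constant summary c minimizing sum_i loss(y_i - c), by grid search.
    Different losses translate different aims into different summaries."""
    grid = np.linspace(y.min(), y.max(), 4001)
    vals = [np.sum(loss(y - c)) for c in grid]
    return grid[int(np.argmin(vals))]

squared = lambda u: u ** 2                             # -> sample mean
absolute = lambda u: np.abs(u)                         # -> sample median
pinball = lambda tau: (lambda u: u * (tau - (u < 0)))  # -> tau-quantile
```

On right-skewed data the three summaries differ markedly, so the "subjective" choice among them is exactly the translation step the abstract describes.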
Corresponding Author
Abstract
In this paper, under a nonparametric regression model, we introduce a family of robust procedures to estimate the regression function when missing data occur in the response. Our proposal is based on a local M-functional applied to the conditional distribution function estimate adapted to the presence of missing data. We show that the robust procedure is consistent and asymptotically normally distributed.
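A crude sketch of the flavor of such a procedure: a kernel-weighted Huber M-estimate computed from the observed responses only. This is a complete-case simplification with the residual scale fixed at 1, not the paper's conditional-distribution-based proposal, and every name and default here is an assumption for illustration.

```python
import numpy as np

def huber_psi(u, k=1.345):
    """Huber's psi function: identity on [-k, k], clipped outside."""
    return np.clip(u, -k, k)

def local_m_estimate(x0, x, y, obs, h, k=1.345, iters=50):
    """Kernel-weighted Huber M-estimate of the regression function at x0 using
    only observed responses (obs is a boolean missingness indicator), solved by
    iteratively reweighted means. A complete-case simplification."""
    w = np.exp(-0.5 * ((x - x0) / h) ** 2) * obs   # kernel weight, 0 if missing
    yv = np.where(obs, y, 0.0)
    mu = np.sum(w * yv) / np.sum(w)                # weighted-mean start
    for _ in range(iters):
        r = yv - mu
        r_safe = np.where(r == 0, 1.0, r)
        wr = np.where(r == 0, 1.0, huber_psi(r, k) / r_safe)  # psi(r)/r weights
        mu = np.sum(w * wr * yv) / np.sum(w * wr)
    return mu
```

Because psi(r)/r downweights large residuals, contaminated responses pull this estimate far less than they pull the plain kernel-weighted mean.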
On Robustness in Kernel Based Regression
Abstract
It is well-known that Kernel Based Regression (KBR) with a least squares loss has some undesirable properties from a robustness point of view. KBR with more robust loss functions, e.g. Huber or logistic losses, often gives rise to more complicated computations and optimization problems. In classical statistics, robustness is improved by reweighting the original estimate. We study reweighting the KBR estimate using four different weight functions. In addition, we show that both the smoother and the cross-validation procedure have to be robust in order to obtain a fully robust procedure.
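The reweighting idea can be sketched with kernel ridge regression: fit by least squares, then refit with case weights psi(r/s)/(r/s) computed from the residuals, and iterate. The Huber weight used below is one plausible choice (the paper studies four weight functions); the kernel, scale estimate, and defaults are all illustrative assumptions.

```python
import numpy as np

def gauss_kernel(a, b, sigma):
    """Gaussian kernel matrix between two 1-D sample vectors."""
    return np.exp(-((a[:, None] - b[None, :]) ** 2) / (2 * sigma ** 2))

def reweighted_krr(x, y, lam=1e-3, sigma=0.1, steps=5, k=1.345):
    """Kernel ridge regression followed by Huber-type reweighting: each refit
    solves the weighted normal equations (diag(w) K + n*lam*I) alpha = diag(w) y
    with case weights derived from the previous residuals (s: MAD scale).
    steps=0 gives the plain, non-robust KRR fit."""
    K = gauss_kernel(x, x, sigma)
    n = len(x)
    w = np.ones(n)
    for _ in range(steps + 1):
        alpha = np.linalg.solve(w[:, None] * K + n * lam * np.eye(n), w * y)
        r = y - K @ alpha
        s = 1.4826 * np.median(np.abs(r)) + 1e-12   # robust residual scale
        u = np.abs(r) / s
        w = np.where(u <= k, 1.0, k / u)            # Huber case weights
    return lambda xnew: gauss_kernel(np.asarray(xnew), x, sigma) @ alpha
```

Gross outliers receive weights k*s/|r| approaching zero, so a few reweighting steps largely remove their pull on the fit, while clean observations keep weight one.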
Local Polynomial Quantile Regression with Parametric Features
2009
Abstract
We propose a new approach to conditional quantile function estimation that combines both parametric and nonparametric techniques. At each design point, a global, possibly incorrect, pilot parametric model is locally adjusted through a kernel smoothing fit. The resulting quantile regression estimator behaves like a parametric one when the latter is correct and converges to the nonparametric solution as the parametric start deviates from the true underlying model. We give a Bahadur-type representation of the proposed estimator, from which consistency and asymptotic normality are derived under an α-mixing assumption. We also propose a practical bandwidth selector based on the plug-in principle and discuss the numerical implementation of the new estimator. Finally, we investigate the performance of the proposed method via simulations and illustrate the methodology on a data example. Key words: bias reduction; local polynomial smoothing; model misspecification; robustness.
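A toy version of the "global pilot, local adjustment" scheme: fit a crude global linear quantile model by the check loss, then correct it at the target point with a kernel-weighted constant shift under the same loss. The grid-search solvers and all parameter defaults are illustrative stand-ins for the paper's estimator and bandwidth selector.

```python
import numpy as np

def pinball(u, tau):
    """Check (pinball) loss rho_tau(u) = u * (tau - 1{u < 0})."""
    return u * (tau - (u < 0))

def pilot_adjusted_quantile(x0, x, y, tau, h, grid_n=401):
    """Global linear tau-quantile pilot (fitted by check loss over a coarse
    intercept/slope grid), then a local constant correction at x0 with
    Gaussian kernel weights. If the pilot is right, the correction is ~0;
    if it is wrong, the local step pulls toward the nonparametric answer."""
    # 1. crude global linear quantile pilot by grid search
    a_grid = np.linspace(y.min(), y.max(), 41)
    b_grid = np.linspace(-3, 3, 41)
    best, pilot = np.inf, (0.0, 0.0)
    for a in a_grid:
        for b in b_grid:
            v = np.sum(pinball(y - a - b * x, tau))
            if v < best:
                best, pilot = v, (a, b)
    a, b = pilot
    # 2. kernel-weighted local correction of the pilot at x0
    w = np.exp(-0.5 * ((x - x0) / h) ** 2)
    r = y - a - b * x
    c_grid = np.linspace(r.min(), r.max(), grid_n)
    c = c_grid[int(np.argmin([np.sum(w * pinball(r - cc, tau)) for cc in c_grid]))]
    return a + b * x0 + c
```

When the linear pilot is correct, as in the test below, the local shift mainly absorbs the pilot's discretization error, mirroring the abstract's claim that the estimator behaves parametrically in that case.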