Wavelet regression in random design with heteroscedastic dependent errors
MR2549564
, 2009
Abstract

Cited by 10 (2 self)
We investigate function estimation in nonparametric regression models with random design and heteroscedastic correlated noise. Adaptive properties of warped wavelet nonlinear approximations are studied over a wide range of Besov scales, f ∈ B^s_{π,r}, and for a variety of L^p error measures. We consider error distributions with long-range dependence parameter α, 0 < α ≤ 1; heteroscedasticity is modeled with a design-dependent function σ. We prescribe a tuning paradigm under which warped wavelet estimation achieves partial or full adaptivity, with rates that are shown to be the minimax rates of convergence. For p > 2, there are three rate phases, namely the dense, sparse and long-range dependence phases, depending on the relative values of s, p, π and α. Furthermore, we show that long-range dependence does not come into play for shape estimation f − ∫ f. The theory is illustrated with some numerical examples.
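The warping device behind this line of work can be sketched in a few lines of numpy: empirical wavelet coefficients are computed at the design points transformed by the design distribution G (assumed known here), then hard-thresholded. The choice of Haar wavelets, the truncation level, the threshold constant, and the simulated design are illustrative assumptions, not the tuning paradigm of the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def haar_psi(t, j, k):
    """Haar mother wavelet 2^{j/2} psi(2^j t - k), supported on [k/2^j, (k+1)/2^j)."""
    s = 2.0 ** j * t - k
    return 2.0 ** (j / 2) * (((0 <= s) & (s < 0.5)) * 1.0 - ((0.5 <= s) & (s < 1)) * 1.0)

# simulated data: design concentrated near 0, with known c.d.f. G(x) = sqrt(x)
n = 2000
X = rng.uniform(0, 1, n) ** 2            # design density g(x) = 1 / (2 sqrt(x))
G = np.sqrt                              # known design distribution function
f = lambda x: np.sin(2 * np.pi * x)      # regression function (illustrative)
Y = f(X) + 0.3 * rng.standard_normal(n)

U = G(X)                                 # warped design points, approximately U[0, 1]

# empirical warped-Haar coefficients of f o G^{-1}, hard-thresholded
Jn = 5                                   # truncation level (assumption)
alpha0 = Y.mean()                        # scaling coefficient
lam = np.sqrt(2 * np.log(n) / n)         # universal-type threshold (assumption)

coeffs = {}
for j in range(Jn):
    for k in range(2 ** j):
        b = np.mean(Y * haar_psi(U, j, k))
        if abs(b) > lam:
            coeffs[(j, k)] = b

def fhat(x):
    """Estimate f(x) by evaluating the warped expansion at G(x)."""
    u = G(np.asarray(x, dtype=float))
    out = np.full_like(u, alpha0)
    for (j, k), b in coeffs.items():
        out += b * haar_psi(u, j, k)
    return out

grid = np.linspace(0.01, 0.99, 99)
print("mean abs error on grid:", np.mean(np.abs(fhat(grid) - f(grid))))
```

The point of the warping is that E[Y ψ_{jk}(G(X))] = ∫ f(G⁻¹(u)) ψ_{jk}(u) du, so the empirical coefficients target the standard Haar coefficients of f ∘ G⁻¹ without any binning or density estimation.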
Iterative feature selection in least square regression estimation
 Ann. Inst. Henri Poincaré Probab. Stat
On pointwise adaptive curve estimation based on inhomogeneous data. Preprint LPMA no 974 available at http://hal.ccsd.cnrs.fr/ccsd00004605/en
Abstract

Cited by 6 (1 self)
Abstract. We want to recover a signal from noisy inhomogeneous data (the amount of data can vary strongly over the estimation domain). We model the data using nonparametric regression with random design, and we focus on estimating the regression function at a fixed point x0 with little, or much, data. We propose a method that adapts both to the local amount of data (the design density is unknown) and to the local smoothness of the regression function. The procedure consists of a local polynomial estimator with a Lepski-type data-driven bandwidth selector; see, for instance, Lepski et al. (1997). We assess this procedure in the minimax setup, over a class of functions with local smoothness s > 0 of Hölder type. We quantify the amount of data at x0 in terms of a local property of the design density called regular variation, which allows situations with strong variations in the concentration of the observations. Moreover, the optimality of the procedure is proved within this framework.
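A schematic version of a Lepski-type bandwidth selector can be written in a few lines: try bandwidths from large to small and keep the largest one whose estimate agrees, up to a noise-level tolerance, with every smaller-bandwidth estimate on the grid. This sketch uses a local-constant (rather than local-polynomial) estimator, an assumed known noise level sigma, and an arbitrary geometric grid; it is not the exact procedure of the paper.

```python
import numpy as np

rng = np.random.default_rng(1)

def local_mean(x0, h, X, Y):
    """Local-constant estimate: average of Y over the window |X - x0| <= h."""
    mask = np.abs(X - x0) <= h
    return (Y[mask].mean() if mask.any() else 0.0), int(mask.sum())

def lepski_estimate(x0, X, Y, sigma, kappa=2.0):
    """Schematic Lepski rule: accept the largest bandwidth h such that
    |f_h - f_h'| <= kappa * sigma * sqrt(log n / N(h')) for every smaller
    h' on the grid, where N(h') is the local sample size."""
    n = len(X)
    hs = [0.5 * 0.7 ** i for i in range(12)]       # geometric grid, large -> small
    ests = [local_mean(x0, h, X, Y) for h in hs]
    chosen = ests[-1][0]                            # fall back to the smallest h
    for i in range(len(hs)):                        # try large bandwidths first
        fi, _ = ests[i]
        ok = all(
            abs(fi - fj) <= kappa * sigma * np.sqrt(np.log(n) / max(Nj, 1))
            for fj, Nj in ests[i + 1:]
        )
        if ok:
            chosen = fi
            break
    return chosen

n = 4000
X = rng.uniform(0, 1, n)
f = lambda x: x ** 2
sigma = 0.2
Y = f(X) + sigma * rng.standard_normal(n)
print(lepski_estimate(0.5, X, Y, sigma))  # target: f(0.5) = 0.25
```

Inhomogeneity enters only through N(h'): where the design is dense the tolerance shrinks and small bandwidths become usable, where it is sparse the rule settles for a larger window.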
Uniform estimation of a signal based on inhomogeneous data. Statist. Sinica 19 427–447. MR2514170
RR n° 7647: estimation in the nonparametric random coefficients binary choice model by needlet thresholding
, 2009
Abstract

Cited by 4 (0 self)
Abstract. We want to reconstruct a signal from inhomogeneous data (the amount of data can vary strongly), using the model of regression with a random design. Our aim is to understand the consequences of inhomogeneity for the accuracy of estimation within the minimax framework. Using the uniform metric weighted by a spatially dependent rate as a benchmark for estimator accuracy, we are able to capture the deformation of the usual minimax rate in situations with local scarcity of data (modelled by a design density with vanishing points). In particular, we construct an estimator that is both design- and smoothness-adaptive, and a new criterion is developed to prove the optimality of these deformed rates.
Analysis of Fractals, Image Compression, Entropy Encoding, Karhunen–Loève Transforms
, 2008
Warped Wavelet and Vertical thresholding
, 2008
Abstract

Cited by 1 (0 self)
Let {(X_i, Y_i), i = 1, …, n} be an i.i.d. sample from the random design regression model Y = f(X) + ε with (X, Y) ∈ [0, 1] × [−M, M]. In such a model, adaptation is naturally understood in terms of the L²([0, 1], G_X) norm, where G_X(·) denotes the (known) marginal distribution of the design variable X. Recently much work has been devoted to the construction of estimators that adapt in this setting (see, for example, [5, 24, 25, 32]), but only a few of them come with an easy-to-implement computational scheme. Here we propose a family of estimators based on the warped wavelet basis recently introduced by Picard and Kerkyacharian [36] and a tree-like thresholding rule that takes into account the hierarchical (across-scale) structure of the wavelet coefficients. We show that, if the regression function belongs to a certain class of approximation spaces defined in terms of G_X(·), then our procedure is adaptive and converges to the true regression function at an optimal rate. The results are stated in terms of excess probabilities, as in [19].
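The across-scale idea can be illustrated with a hereditary hard-thresholding rule on a dyadic coefficient tree: a coefficient is kept whenever it or one of its descendants is large, so the retained set forms rooted subtrees. This minimal sketch is a simplified stand-in for, not a reproduction of, the paper's tree-like rule.

```python
def tree_threshold(coeffs, lam):
    """Hereditary hard thresholding on a dyadic coefficient tree.

    coeffs: dict {(j, k): value}, where the parent of (j, k) is (j-1, k // 2).
    A coefficient survives iff |it| > lam or some descendant exceeds lam, so
    the kept set is a union of rooted subtrees (tree-structured selection)."""
    levels = sorted({j for j, _ in coeffs})
    big = {jk: abs(v) > lam for jk, v in coeffs.items()}
    # propagate "large somewhere below" from fine levels up to coarse levels
    for j in reversed(levels):
        for (jj, k), flag in list(big.items()):
            if jj == j and flag and jj > levels[0]:
                big[(jj - 1, k // 2)] = True
    return {jk: v for jk, v in coeffs.items() if big.get(jk, False)}

# toy coefficient tree: one strong branch, scattered small noise elsewhere
coeffs = {(0, 0): 0.05, (1, 0): 0.02, (1, 1): 0.4,
          (2, 0): 0.01, (2, 1): 0.02, (2, 2): 0.03, (2, 3): 0.5}
kept = tree_threshold(coeffs, lam=0.1)
print(sorted(kept))   # the branch (0, 0) -> (1, 1) -> (2, 3) survives
```

Note how (0, 0) survives despite being below the threshold: keeping whole ancestor paths is exactly what makes the selected coefficients a tree rather than an arbitrary set.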
Adaptive Posterior Mode Estimation of a Sparse Sequence for Model Selection
Abstract

Cited by 1 (0 self)
For the problem of estimating a sparse sequence of coefficients of a parametric or nonparametric generalized linear model, posterior mode estimation with a Subbotin(λ, ν) prior achieves thresholding and therefore model selection when ν ∈ [0, 1] for a class of likelihood functions. The proposed estimator also offers a continuum between the (forward/backward) best subset estimator (ν = 0), its approximate convexification called lasso (ν = 1) and ridge regression (ν = 2). Rather than fixing ν, selecting the two hyperparameters λ and ν adds flexibility for a better fit, provided both are well selected from the data. Considering first the canonical Gaussian model, we generalize the Stein unbiased risk estimate SURE(λ, ν) to the situation where the thresholding function is not almost differentiable (i.e., ν < 1). We then propose a more general selection of λ and ν by deriving an information criterion that can be employed for instance for the lasso or wavelet smoothing. We investigate some asymptotic properties in parametric and nonparametric settings. Simulations and applications to real data show excellent performance.
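The thresholding behaviour of the posterior mode can be checked numerically in the canonical Gaussian case. The sketch below minimizes the penalized least squares criterion by brute-force grid search; the unit noise variance, the grid solver and the ν = 0 limiting (count) penalty are illustrative assumptions, not the paper's algorithm.

```python
import numpy as np

def subbotin_mode(y, lam, nu, grid_size=20001):
    """Posterior mode under a Gaussian likelihood y ~ N(theta, 1) and a
    Subbotin(lambda, nu) prior p(theta) ∝ exp(-|theta/lambda|^nu / nu):
        argmin_theta  (y - theta)^2 / 2 + |theta/lambda|^nu / nu,
    found by grid search. For nu <= 1 the minimizer thresholds small y."""
    theta = np.linspace(-abs(y) - 1, abs(y) + 1, grid_size)
    if nu == 0:
        pen = (theta != 0) * 1.0 / lam   # nu -> 0 limit: best-subset-type penalty
    else:
        pen = np.abs(theta / lam) ** nu / nu
    obj = 0.5 * (y - theta) ** 2 + pen
    return theta[np.argmin(obj)]

# continuum of rules at y = 3, lambda = 1: best subset (0), bridge (0.5),
# lasso / soft thresholding (1), ridge-type shrinkage (2)
for nu in (0.0, 0.5, 1.0, 2.0):
    print(nu, subbotin_mode(3.0, 1.0, nu))
```

For ν = 1 this reproduces soft thresholding at 1/λ (so y = 3, λ = 1 shrinks to 2, while y = 0.5 is set to 0), and for ν = 2 the familiar linear ridge shrinkage y λ²/(1 + λ²).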
NONPARAMETRIC REGRESSION ESTIMATION BASED ON SPATIALLY INHOMOGENEOUS DATA: MINIMAX GLOBAL CONVERGENCE RATES AND ADAPTIVITY
, 2012
Abstract

Cited by 1 (0 self)
Abstract. We consider the nonparametric regression estimation problem of recovering an unknown response function f on the basis of spatially inhomogeneous data when the design points follow a known density g with a finite number of well-separated zeros. In particular, we consider two different cases: when g has zeros of a polynomial order and when g has zeros of an exponential order. These two cases correspond to moderate and severe data losses, respectively. We obtain asymptotic (as the sample size increases) minimax lower bounds for the L²-risk when f is assumed to belong to a Besov ball, and construct adaptive wavelet thresholding estimators of f that are asymptotically optimal (in the minimax sense) or near-optimal within a logarithmic factor (in the case of a zero of a polynomial order), over a wide range of Besov balls. The spatially inhomogeneous ill-posed problem that we investigate is inherently more difficult than spatially homogeneous ill-posed problems like, e.g., deconvolution. In particular, due to spatial irregularity, assessment of asymptotic minimax global convergence rates is a much harder task than the derivation of asymptotic minimax local convergence rates studied recently in the literature. Furthermore, the resulting estimators exhibit very different behavior and asymptotic minimax global convergence rates in comparison with the solution of spatially homogeneous ill-posed problems. For example, unlike in the deconvolution problem, the asymptotic minimax global convergence rates are greatly influenced not only by the extent of data loss but also by the degree of spatial homogeneity of f. Specifically, even if 1/g is non-integrable, one can recover f as well as in the case of an equispaced design (in terms of asymptotic minimax global convergence rates) when f is homogeneous enough, since the estimator is “borrowing strength” in the areas where f is adequately sampled.
Toulouse, France, and
, 2005
Abstract
In this paper we focus on nonparametric estimation of a constrained regression function using penalized wavelet regression techniques. This results in a convex optimization problem under linear constraints. Necessary and sufficient conditions for the existence of a unique solution are discussed. The estimator is easily obtained via the dual formulation of the optimization problem. In particular we investigate a penalized wavelet monotone regression estimator. We establish the rate of convergence of this estimator, and illustrate its finite-sample performance via a simulation study. We also compare its performance with that of a recently proposed constrained estimator. An illustration on some real data is given.
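As a much simpler illustration of least squares under a monotonicity (linear) constraint, the classical pool-adjacent-violators algorithm computes the closest nondecreasing fit in L². This is a generic textbook device for constrained regression, not the authors' penalized wavelet estimator:

```python
import numpy as np

def pava(y):
    """Pool-adjacent-violators: the nondecreasing vector closest to y in L2.

    Maintains a stack of blocks (mean, size); whenever two adjacent blocks
    violate monotonicity they are merged and replaced by their weighted mean."""
    y = np.asarray(y, dtype=float)
    vals, wts = [], []                 # block means and block sizes
    for v in y:
        vals.append(v)
        wts.append(1)
        # merge blocks while the monotonicity constraint is violated
        while len(vals) > 1 and vals[-2] > vals[-1]:
            w = wts[-2] + wts[-1]
            m = (vals[-2] * wts[-2] + vals[-1] * wts[-1]) / w
            vals[-2:] = [m]
            wts[-2:] = [w]
    return np.repeat(vals, wts)

print(pava([1.0, 3.0, 2.0, 4.0]))   # isotonic fit: [1, 2.5, 2.5, 4]
```

The violating pair (3, 2) is pooled into its mean 2.5, which is exactly the Karush–Kuhn–Tucker solution of this tiny constrained least squares problem; the paper's estimator adds a wavelet penalty on top of such linear constraints and solves the resulting problem through its dual.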