Sharp Adaptation for Inverse Problems With Random Noise, 2000
Abstract

Cited by 84 (8 self)
We consider a heteroscedastic sequence space setup with polynomially increasing variances of observations that allows us to treat a number of inverse problems, in particular multivariate ones. We propose an adaptive estimator that attains simultaneously exact asymptotic minimax constants on every ellipsoid of functions within a wide scale (that includes ellipsoids with polynomially and exponentially decreasing axes) and, at the same time, satisfies asymptotically exact oracle inequalities within any class of linear estimates having monotone nondecreasing weights. As an application, we construct sharp adaptive estimators in the problems of deconvolution and tomography.
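The sequence space model described in the abstract can be illustrated with a minimal simulation. All specifics below (noise level, degree of ill-posedness, cut-off bandwidth, true coefficients) are assumed values for illustration, not the paper's adaptive construction.

```python
import numpy as np

# Hypothetical illustration of the heteroscedastic sequence space model:
# y_k = theta_k + sigma_k * xi_k, with variances growing polynomially in k,
# as arises in deconvolution and tomography.
rng = np.random.default_rng(0)

n = 200
beta = 0.5                       # assumed degree of ill-posedness
k = np.arange(1, n + 1)
theta = 1.0 / k**2               # assumed true coefficients (ellipsoid-type decay)
sigma = 0.05 * k**beta           # noise level increasing polynomially in k
y = theta + sigma * rng.standard_normal(n)

# One linear estimate from the class the oracle inequalities cover:
# multiply observations by a monotone weight sequence (here a simple
# spectral cut-off, not the paper's adaptive weight choice).
cutoff = 20
w = (k <= cutoff).astype(float)
theta_hat = w * y

# Squared-error loss of this particular linear estimate.
risk = np.sum((theta_hat - theta) ** 2)
```

The cut-off weights are the crudest member of the linear class; the paper's point is that a single adaptive estimator can match the best such weight sequence without knowing the ellipsoid.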
Estimation of Piecewise-Smooth Functions by Amalgamated Bridge Regression Splines
Abstract
We consider nonparametric estimation of a one-dimensional piecewise-smooth function observed with white Gaussian noise on an interval. We propose a two-step estimation procedure, where one first detects jump points by a wavelet-based procedure and then estimates the function on each smooth segment separately by bridge regression splines. We prove the asymptotic optimality (in the minimax sense) of the resulting amalgamated bridge regression spline estimator and demonstrate its efficiency on simulated and real data examples. AMS (2000) subject classification. Primary.
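The two-step idea (detect jumps first, then smooth each segment) can be sketched in a toy form. The paper uses a wavelet-based detector and bridge regression splines; the difference threshold and per-segment polynomial fits below are simplified stand-ins, chosen only to show the structure of the procedure.

```python
import numpy as np

# Toy data: a piecewise-smooth function with one jump, observed in noise.
rng = np.random.default_rng(1)
n = 256
x = np.linspace(0, 1, n)
f = np.where(x < 0.5, np.sin(2 * np.pi * x), 2 + np.cos(2 * np.pi * x))
y = f + 0.1 * rng.standard_normal(n)

# Step 1 (stand-in for the wavelet-based detector): flag indices where
# successive differences are unusually large relative to the noise scale.
d = np.abs(np.diff(y))
threshold = 6 * np.median(d)           # crude, hypothetical threshold
jumps = np.where(d > threshold)[0] + 1

# Step 2 (stand-in for bridge regression splines): fit each detected
# smooth segment separately by a low-order polynomial.
f_hat = np.empty(n)
for idx in np.split(np.arange(n), jumps):
    if idx.size == 0:
        continue
    deg = min(3, idx.size - 1)         # guard tiny segments
    coef = np.polyfit(x[idx], y[idx], deg=deg)
    f_hat[idx] = np.polyval(coef, x[idx])
```

Fitting segments independently is what keeps the jump from being smeared, which is the motivation for amalgamating per-segment estimates rather than smoothing globally.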
On Weakly Bounded Noise in Ill-Posed Problems, 2008
Abstract
We study compact operator equations with noisy data in Hilbert space. Instead of assuming that the error in the data converges strongly to 0, we only assume weak convergence. Under the usual source conditions, we derive optimal convergence rates for convexly constrained Phillips-Tikhonov regularization. We also discuss a discrepancy principle and establish its asymptotic behavior. As an example, we discuss compact integral equations in L2(0, 1) with data perturbed by white noise, as well as the discrete case.
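A discretized version of the setting in this abstract is easy to write down. The kernel, noise level, and regularization parameter below are assumed for illustration; the sketch shows plain (unconstrained) Tikhonov regularization of a compact integral operator on L2(0, 1), not the paper's convexly constrained variant or its discrepancy principle.

```python
import numpy as np

# Discretize a compact integral equation (Af)(s) = \int_0^1 k(s,t) f(t) dt
# on a midpoint grid, with kernel k(s,t) = min(s,t) (assumed for illustration).
rng = np.random.default_rng(2)
n = 100
t = (np.arange(n) + 0.5) / n
K = np.minimum.outer(t, t) / n          # quadrature weight 1/n

f_true = np.sin(np.pi * t)
g = K @ f_true
g_noisy = g + 1e-3 * rng.standard_normal(n)   # white-noise perturbation

# Tikhonov regularization: minimize ||K f - g||^2 + alpha ||f||^2, i.e.
# f_alpha = (K^T K + alpha I)^{-1} K^T g.
alpha = 1e-6                             # illustrative choice, not data-driven
f_alpha = np.linalg.solve(K.T @ K + alpha * np.eye(n), K.T @ g_noisy)
```

In practice alpha would be chosen from the data, e.g. by the discrepancy principle the abstract mentions (increase alpha until the residual matches the noise level), rather than fixed in advance.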