Results 1 – 9 of 9
Elastic-Net Regularization: Error Estimates and Active Set Methods
Abstract

Cited by 11 (6 self)
This paper investigates theoretical properties and efficient numerical algorithms for the so-called elastic-net regularization originating from statistics, which enforces ℓ1 and ℓ2 regularization simultaneously. The stability of the minimizer and its consistency are studied, and convergence rates for both a priori and a posteriori parameter choice rules are established. Two iterative numerical algorithms of active set type are proposed, and their convergence properties are discussed. Numerical results are presented to illustrate the features of the functional and algorithms.
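The elastic-net functional combines an ℓ1 and an ℓ2 penalty with a quadratic data-fidelity term. As a rough illustration only (not the active set algorithms proposed in the paper), the following minimal proximal-gradient sketch minimizes the assumed form ‖Ax − b‖² + α‖x‖₁ + β‖x‖₂²:

```python
import numpy as np

def elastic_net_ista(A, b, alpha, beta, n_iter=500):
    """Minimize ||Ax - b||^2 + alpha*||x||_1 + beta*||x||_2^2
    by proximal gradient descent (illustrative sketch)."""
    x = np.zeros(A.shape[1])
    # Step size 1/L, where L = 2*||A||_2^2 is the Lipschitz
    # constant of the gradient of the fidelity term.
    t = 1.0 / (2.0 * np.linalg.norm(A, 2) ** 2)
    for _ in range(n_iter):
        v = x - t * 2.0 * A.T @ (A @ x - b)   # gradient step on ||Ax - b||^2
        # Proximal map of t*(alpha*||.||_1 + beta*||.||_2^2):
        # soft-thresholding followed by a shrinkage factor.
        x = np.sign(v) * np.maximum(np.abs(v) - t * alpha, 0.0) / (1.0 + 2.0 * t * beta)
    return x
```

The ℓ2 term makes the proximal map single-valued and strictly shrinking, which is one way to see the stability of the minimizer discussed in the abstract.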
Heuristic Parameter-Choice Rules for Convex Variational Regularization Based on Error Estimates
 SIAM Journal on Numerical Analysis
, 2010
The Residual Method for Regularizing Ill-Posed Problems
, 2009
Abstract

Cited by 5 (3 self)
Although the residual method, or constrained regularization, is frequently used in applications, a detailed study of its properties is still missing. In particular, the questions of stability and convergence rates have hardly been treated in the literature. This sharply contrasts with the progress of the theory of Tikhonov regularization, where for instance the notion of the Bregman distance has recently led to a series of new results for regularization in Banach spaces. The present paper intends to bridge the gap between the existing theories as far as possible. We develop a stability and convergence theory for the residual method in general topological spaces. In addition, we prove convergence rates in terms of (generalized) Bregman distances. As an example, we apply our theory to compressed sensing. Here, we show the well-posedness of the method and derive convergence rates for both convex and non-convex regularization under rather weak conditions. It is for instance shown that in the non-convex setting the linear convergence of the regularized solutions already follows from the sparsity of the true solution and the injectivity of the (linear) operator in the equation to be solved.
Convergence rates in ℓ1 regularization if the sparsity assumption fails
 Inverse Problems
Abstract

Cited by 3 (3 self)
Variational sparsity regularization based on ℓ1-norms and other nonlinear functionals has gained enormous attention recently, both with respect to its applications and its mathematical analysis. A focus in regularization theory has been to develop error estimates in terms of the regularization parameter and the noise strength. For this sake specific error measures such as Bregman distances and specific conditions on the solution such as source conditions or variational inequalities have been developed and used. In this paper we provide, for a certain class of ill-posed linear operator equations, a convergence analysis that works for solutions that are not completely sparse, but have a fast-decaying nonzero part. This case is not covered by standard source conditions, but surprisingly can be treated with an appropriate variational inequality. As a consequence the paper also provides the first examples where the variational inequality approach, which was often believed to be equivalent to appropriate source conditions, can indeed go farther than the latter.
Beyond convergence rates: Exact recovery with the Tikhonov regularization with sparsity constraints
 Inverse Problems
, 2011
Abstract

Cited by 3 (0 self)
Abstract. The Tikhonov regularization of linear ill-posed problems with an ℓ1 penalty is considered. We recall results for linear convergence rates and results on exact recovery of the support. Moreover, we derive conditions for exact support recovery which are especially applicable in the case of ill-posed problems, where other conditions, e.g. those based on the so-called coherence or the restricted isometry property, are usually not applicable. The obtained results also show that the regularized solutions converge not only in the ℓ1-norm but also in the vector space ℓ0 (when considered as the strict inductive limit of the spaces R^n as n tends to infinity). Additionally, the relations between different conditions for exact support recovery and linear convergence rates are investigated. With an imaging example from digital holography the applicability of the obtained results is illustrated, i.e. one may check a priori whether the experimental setup guarantees exact recovery with Tikhonov regularization with sparsity constraints. AMS classification scheme numbers: 47A52, 65J20
Sparsity and Compressed Sensing in Inverse Problems
Abstract
Abstract This chapter is concerned with two important topics in the context of sparse recovery in inverse and ill-posed problems. In the first part we elaborate conditions for exact recovery. In particular, we describe how both ℓ1-minimization and matching pursuit methods can be used to regularize ill-posed problems and, moreover, state conditions which guarantee exact recovery of the support in the sparse case. The focus of the second part is on the incomplete data scenario. We discuss extensions of compressed sensing for specific infinite-dimensional ill-posed measurement regimes. We are able to establish recovery error estimates when adequately relating the isometry constant of the sensing operator, the ill-posedness of the underlying model operator and the regularization parameter. Finally, we very briefly sketch how projected steepest descent iterations can be applied to retrieve the sparse solution.
TIKHONOV REGULARIZATION WITH SPARSITY CONSTRAINTS
Abstract
The consecutive numbering of the publications is determined by their chronological order. The aim of this preprint series is to make new research rapidly available for scientific discussion. Therefore, the responsibility for the contents lies solely with the authors. The publications will be distributed by the authors.
Bounds and Exact Inversion
, 2009
Constructing test instances for Basis Pursuit Denoising
Abstract
Abstract—The number of available algorithms for the so-called Basis Pursuit Denoising problem (or the related LASSO problem) is large and keeps growing. Similarly, the number of experiments to evaluate and compare these algorithms on different instances is growing. In this note, we present a method to produce instances with exact solutions which is based on a simple observation related to the so-called source condition from sparse regularization. EDICS: DSPRECO, DSPALGO
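The observation hinted at here is that an exact solution can be built backwards from a dual certificate. The sketch below is a hypothetical helper under an assumed problem form, min_x λ‖x‖₁ + ½‖Ax − b‖₂², not the paper's exact procedure: pick a dual vector y, rescale it so that ‖Aᵀy‖∞ = λ, support the solution where the certificate is tight, and set b = Ax* + y so that the optimality condition Aᵀ(b − Ax*) ∈ λ∂‖x*‖₁ holds by construction.

```python
import numpy as np

def bpdn_instance(A, lam, rng):
    """Construct (b, x_star) such that x_star exactly solves
    min_x lam*||x||_1 + 0.5*||A x - b||_2^2  (illustrative sketch)."""
    m, n = A.shape
    y = rng.standard_normal(m)               # arbitrary dual vector
    w = A.T @ y
    y *= lam / np.max(np.abs(w))              # rescale so that max|A^T y| = lam
    w = A.T @ y
    S = np.abs(w) >= lam - 1e-12              # indices where the certificate is tight
    x_star = np.zeros(n)
    # Nonzero entries must carry the signs dictated by the certificate.
    x_star[S] = np.sign(w[S]) * rng.uniform(1.0, 2.0, S.sum())
    b = A @ x_star + y                        # then A^T(b - A x_star) = A^T y
    return b, x_star
```

By construction the residual certificate r = Aᵀ(b − Ax*) satisfies |rᵢ| ≤ λ everywhere with rᵢ = λ·sign(x*ᵢ) on the support, so x* fulfills the optimality condition of the convex problem and is an exact solution.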