Results 11–20 of 30
Robust nonparametric regression via sparsity control with application to load curve data cleansing
 IEEE Transactions on Signal Processing
Cited by 8 (1 self)
Abstract—Nonparametric methods are widely applicable to statistical inference problems, since they rely on few modeling assumptions. In this context, the fresh look advocated here permeates benefits from variable selection and compressive sampling, to robustify nonparametric regression against outliers—that is, data markedly deviating from the postulated models. A variational counterpart to least-trimmed squares regression is shown closely related to an ℓ0-(pseudo)norm-regularized estimator that encourages sparsity in a vector explicitly modeling the outliers. This connection suggests efficient solvers based on convex relaxation, which lead naturally to a variational M-type estimator equivalent to the least-absolute shrinkage and selection operator (Lasso). Outliers are identified by judiciously tuning regularization parameters, which amounts to controlling the sparsity of the outlier vector along the whole robustification path of Lasso solutions. Reduced bias and enhanced generalization capability are attractive features of an improved estimator obtained after replacing the ℓ0-(pseudo)norm with a nonconvex surrogate. The novel robust spline-based smoother is adopted to cleanse load curve data, a key task aiding operational decisions in the envisioned smart grid system. Computer simulations and tests on real load curve data corroborate the effectiveness of the novel sparsity-controlling robust estimators. Index Terms—Lasso, load curve cleansing, nonparametric regression, outlier rejection, sparsity, splines.
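The key device in the abstract above — modeling outliers as a sparse vector and identifying them by soft-thresholding residuals — can be sketched in a few lines. This is a minimal illustration of the general idea, not the authors' spline-based estimator; the alternating least-squares/soft-threshold scheme, the data, and the parameter `lam` are my own assumptions for the sketch.

```python
import numpy as np

def soft_threshold(v, t):
    # Entrywise soft-thresholding, the prox operator of t*||.||_1.
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def robust_ls(X, y, lam, n_iter=200):
    # Fit min_{beta,o} 0.5*||y - X beta - o||^2 + lam*||o||_1,
    # where o is an explicit sparse outlier vector, by alternating:
    #   beta-step: exact least squares on the outlier-corrected data,
    #   o-step:    soft-threshold the residuals (nonzeros flag outliers).
    o = np.zeros_like(y)
    X_pinv = np.linalg.pinv(X)
    for _ in range(n_iter):
        beta = X_pinv @ (y - o)
        o = soft_threshold(y - X @ beta, lam)
    return beta, o
```

Tuning `lam` controls how many entries of `o` stay nonzero, i.e. how many points are declared outliers, mirroring the sparsity-control idea in the abstract.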
Heuristic Parameter-Choice Rules for Convex Variational Regularization Based on Error Estimates
 SIAM Journal on Numerical Analysis
, 2010
On the minimization of a Tikhonov functional with a nonconvex sparsity constraint
Beyond convergence rates: Exact recovery with the Tikhonov regularization with sparsity constraints
 Inverse Problems
, 2011
Cited by 3 (0 self)
Abstract. The Tikhonov regularization of linear ill-posed problems with an ℓ1 penalty is considered. We recall results for linear convergence rates and results on exact recovery of the support. Moreover, we derive conditions for exact support recovery which are especially applicable in the case of ill-posed problems, where other conditions, e.g. those based on the so-called coherence or the restricted isometry property, are usually not applicable. The obtained results also show that the regularized solutions not only converge in the ℓ1 norm but also in the vector space ℓ0 (when considered as the strict inductive limit of the spaces Rn as n tends to infinity). Additionally, the relations between different conditions for exact support recovery and linear convergence rates are investigated. With an imaging example from digital holography we illustrate the applicability of the obtained results, i.e. that one may check a priori whether the experimental setup guarantees exact recovery with Tikhonov regularization with sparsity constraints. AMS classification scheme numbers: 47A52, 65J20
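Tikhonov regularization with an ℓ1 penalty, as studied in the abstract above, is the functional 0.5‖Ax − y‖² + α‖x‖₁, and support recovery can be observed numerically with a standard iterative soft-thresholding (ISTA) solver. The sketch below is a generic illustration under an underdetermined Gaussian operator of my choosing; it is not the paper's holography setup, and `alpha` and the problem sizes are assumptions.

```python
import numpy as np

def ista(A, y, alpha, n_iter=2000):
    # Iterative soft-thresholding for min_x 0.5*||Ax - y||^2 + alpha*||x||_1.
    # Step size 1/L with L = ||A||_2^2, the Lipschitz constant of the gradient.
    L = np.linalg.norm(A, 2) ** 2
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        z = x - A.T @ (A @ x - y) / L          # gradient step on the data fit
        x = np.sign(z) * np.maximum(np.abs(z) - alpha / L, 0.0)  # prox step
    return x
```

For a sparse ground truth and a generic random operator, the largest recovered coefficients typically sit exactly on the true support, which is the kind of exact-recovery behavior the paper characterizes a priori.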
Regularization With Nonconvex Separable Constraints (preprint series "Extraktion quantifizierbarer Information aus komplexen Systemen")
, 2009
Preprint 11. The consecutive numbering of the publications is determined by their chronological order. The aim of this preprint series is to make new research rapidly available for scientific discussion. Therefore, the responsibility for the contents is solely due to the authors. The publications will be distributed by the authors.
Rule Weight Optimization and Feature Selection in Fuzzy Systems with Sparsity Constraints
, 2009
In this paper we present a novel data-driven learning method (SparseFIS) for Takagi-Sugeno fuzzy systems, extended by including rule weights. Our learning method consists of three phases: the first phase conducts a clustering process in the input/output feature space with iterative vector quantization. Hereby, the number of clusters (= rules) is predefined and denotes a kind of upper bound on a reasonable granularity. The second phase optimizes the rule weights in the fuzzy systems with respect to a least squares error measure by applying a sparsity-constrained steepest descent optimization procedure. This is done in a coherent optimization procedure together with elicitation of the consequent parameters. Depending on the sparsity threshold, more or fewer rule weights can be forced towards 0, switching off some rules. In this sense, a rule selection is achieved. The third phase estimates the linear consequent parameters by a regularized sparsity-constrained optimization procedure for each rule separately (local learning approach). Sparsity constraints are applied here in order to force linear parameters to 0, triggering a feature selection mechanism per rule. In some cases this may also yield a global feature selection, whenever the linear parameters of some features are near 0 in each rule. The method is evaluated on high-dimensional data from industrial processes and on benchmark data sets from the internet, and compared to well-known batch training methods in terms of accuracy and complexity of the fuzzy systems.
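The second phase described above — sparsity-constrained steepest descent on rule weights, where weights driven to 0 switch rules off — can be sketched generically as a proximal gradient loop. This is my own minimal reading of that phase, not the SparseFIS implementation; the rule-activation matrix `Phi` and the penalty `lam` are assumed for illustration.

```python
import numpy as np

def sparse_rule_weights(Phi, y, lam, n_iter=1000):
    # Sparsity-constrained steepest descent on rule weights w:
    #   min_w 0.5*||Phi w - y||^2 + lam*||w||_1
    # Phi[:, i] holds the (normalized) activation of rule i on each sample.
    # A weight thresholded exactly to 0 switches the corresponding rule off.
    step = 1.0 / np.linalg.norm(Phi, 2) ** 2     # safe gradient step size
    w = np.ones(Phi.shape[1])                    # start with all rules active
    for _ in range(n_iter):
        w = w - step * (Phi.T @ (Phi @ w - y))                        # descent
        w = np.sign(w) * np.maximum(np.abs(w) - step * lam, 0.0)      # prox
    return w
```

Raising `lam` forces more weights to exactly 0, which is the rule-selection mechanism the abstract describes.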
An active set approach to the elastic net and its applications in mass spectrometry
 Author manuscript, published in SPARS'09: Signal Processing with Adaptive Sparse Structured Representations (2009)
, 2009
Abstract—This paper uses the framework of a mass spectrometry application to introduce a new method of peak picking as well as two active set methods for the minimization of the elastic-net functional. Peak picking is essential in mass spectrometry and is often based on mean spectra. In contrast, our procedure uses a set of spectra obtained from a basis learning method. Our procedure utilizes the well-known ℓ1-minimization and corresponding active set algorithms, but comprises ill-conditioned operators such that regularization is required. We show that the elastic net gives a natural justification for Tikhonov-Phillips regularization in the used algorithms. Therefore we introduce adaptations of known active set algorithms for ℓ1-minimization to the elastic net. Furthermore, we illustrate the differences between the algorithms for ℓ1 and the elastic net in numerical examples.
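The link between the elastic net and Tikhonov-Phillips regularization noted in the abstract rests on a standard identity: the elastic-net problem equals a plain ℓ1 problem on a Tikhonov-augmented system, so any ℓ1 solver (active set, ISTA, ...) applies unchanged. The sketch below demonstrates that identity with a simple ISTA inner solver of my own; it is not the paper's active set algorithm, and the test data are assumptions.

```python
import numpy as np

def ista(A, y, alpha, n_iter=3000):
    # Generic iterative soft-thresholding for 0.5*||Ax-y||^2 + alpha*||x||_1.
    L = np.linalg.norm(A, 2) ** 2
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        z = x - A.T @ (A @ x - y) / L
        x = np.sign(z) * np.maximum(np.abs(z) - alpha / L, 0.0)
    return x

def elastic_net(A, y, alpha, mu, n_iter=3000):
    # Elastic net: 0.5*||Ax-y||^2 + alpha*||x||_1 + 0.5*mu*||x||^2.
    # Rewriting with the Tikhonov-augmented operator [A; sqrt(mu)*I] and
    # data [y; 0] turns it into a plain l1 problem, solved by any l1 method.
    n = A.shape[1]
    A_aug = np.vstack([A, np.sqrt(mu) * np.eye(n)])
    y_aug = np.concatenate([y, np.zeros(n)])
    return ista(A_aug, y_aug, alpha, n_iter)
```

With `alpha=0` the augmented problem reduces to ordinary Tikhonov (ridge) regularization, which makes the claimed justification concrete; `mu > 0` is also what keeps the augmented operator well conditioned despite an ill-conditioned `A`.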
TIKHONOV REGULARIZATION WITH SPARSITY CONSTRAINTS
Bounds and Exact Inversion
, 2009