Results 1–10 of 15
A Direct Algorithm for 1D Total Variation Denoising
Abstract

Cited by 16 (0 self)
A very fast non-iterative algorithm is proposed for denoising or smoothing one-dimensional discrete signals, by solving the total variation regularized least-squares problem or the related fused lasso problem. A C code implementation is available on the web page of the author. Index Terms—Total variation, denoising, nonlinear smoothing, fused lasso, regularized least-squares, nonparametric regression, convex nonsmooth optimization, taut string.
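For illustration, the regularized least-squares problem this abstract refers to can also be solved (much more slowly than the paper's direct method) by projected gradient ascent on its dual. The sketch below is a generic solver, not the paper's algorithm; the function name and iteration counts are ours:

```python
def tv_denoise_1d(y, lam, n_iter=3000, step=0.25):
    """Approximately minimize 0.5*||x - y||^2 + lam * sum_i |x[i+1] - x[i]|
    by projected gradient ascent on the dual problem
        max_{|u_i| <= lam}  u^T D y - 0.5*||D^T u||^2,   with x = y - D^T u,
    where D is the forward-difference operator. Step 0.25 is safe since
    ||D D^T|| <= 4. This is NOT the direct algorithm of the paper."""
    n = len(y)
    if n < 2 or lam == 0.0:
        return list(y)
    u = [0.0] * (n - 1)                     # one dual variable per jump

    def primal(u):
        # x = y - D^T u  (adjoint of the forward-difference operator)
        x = list(y)
        x[0] += u[0]
        for i in range(1, n - 1):
            x[i] += u[i] - u[i - 1]
        x[n - 1] -= u[n - 2]
        return x

    for _ in range(n_iter):
        x = primal(u)
        # gradient of the dual is D x; take a step, then clip to [-lam, lam]
        for i in range(n - 1):
            u[i] = max(-lam, min(lam, u[i] + step * (x[i + 1] - x[i])))
    return primal(u)
```

On a noiseless step signal the minimizer is piecewise constant with the two levels pulled toward each other, which is easy to check against the closed-form solution.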
Properties of Higher Order Nonlinear Diffusion Filtering
Abstract

Cited by 12 (0 self)
 Add to MetaCart
This paper provides a mathematical analysis of higher order variational methods and nonlinear diffusion filtering for image denoising. Besides the average grey value, it is shown that higher order diffusion filters preserve higher moments of the initial data. While a maximum-minimum principle in general does not hold for higher order filters, we derive stability in the 2-norm in the continuous and discrete setting. Considering the filters in terms of forward and backward diffusion, one can explain how not only the preservation, but also the enhancement of certain features in the given data is possible. Numerical results show the improved denoising capabilities of higher order filtering compared to the classical methods.
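As a minimal numerical illustration of the preservation property mentioned above, one explicit fourth-order filter step can be built as u ← u − τ·(DᵀD)(DᵀD)u with forward differences D; every component of Dᵀ(·) telescopes to zero, so the average grey value is preserved exactly. This particular discretization, boundary handling, and step size are our assumptions, not the paper's:

```python
def fourth_order_step(u, tau=0.005):
    """One explicit step of a discrete fourth-order diffusion filter,
    u <- u - tau * (D^T D)(D^T D) u, with forward-difference D.
    Each application of D^T(...) sums to zero, so sum(u) is preserved."""
    def DtD(v):
        n = len(v)
        d = [v[i + 1] - v[i] for i in range(n - 1)]   # forward differences
        out = [0.0] * n
        out[0] = -d[0]                                # D^T with "Neumann" ends
        for i in range(1, n - 1):
            out[i] = d[i - 1] - d[i]
        out[n - 1] = d[n - 2]
        return out

    w = DtD(DtD(u))
    return [u[i] - tau * w[i] for i in range(len(u))]
```

Iterating the step flattens the signal toward its mean while the mean itself stays fixed, mirroring the moment-preservation statement in the abstract.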
Generalized Methods and Solvers for Noise Removal from Piecewise Constant Signals. I. Background Theory
 Proceedings of the Royal Society A: Mathematical, Physical and Engineering Science
"... Removing noise from signals which are piecewise constant (PWC) is a challenging signal processing problem that arises in many practical scientific and engineering contexts. For example, in exploration geosciences, noisy drill hole records must be separated into constant stratigraphic zones, and in b ..."
Abstract

Cited by 10 (0 self)
Removing noise from signals which are piecewise constant (PWC) is a challenging signal processing problem that arises in many practical scientific and engineering contexts. For example, in exploration geosciences, noisy drill hole records must be separated into constant stratigraphic zones, and in biophysics, the jumps between states and dwells of a molecular structure need to be determined from noisy fluorescence microscopy signals. This problem is one for which conventional linear signal processing methods are fundamentally unsuited. A wide range of PWC denoising methods exists, including total variation regularization, mean shift clustering, stepwise jump placement, running median filtering, convex clustering shrinkage, bilateral filtering, wavelet shrinkage and hidden Markov models. This paper builds on results from the image processing community to show that the majority of these algorithms, and more proposed in the wider literature, are each associated with a special case of a generalized functional, that, when minimized, solves the PWC denoising problem. We show how the minimizer can be obtained by a range of computational solver algorithms, including stepwise jump placement, quadratic or linear programming, finite differences with and without adaptive step size, iterated running medians, least angle regression, piecewise-linear regularization path following, or coordinate descent. Using this generalized functional, we introduce several novel PWC denoising methods, which, for example, ...
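One of the simplest solvers in the family surveyed above is the (iterated) running median, which removes impulsive noise while preserving step edges. A minimal sketch; the window size and pass count are illustrative parameters, not values from the paper:

```python
import statistics

def running_median(y, half_width=2, n_pass=5):
    """Iterated running median filter: replace each sample by the median of
    its window, repeating so the output approaches a median 'root' signal.
    Windows are truncated at the boundaries."""
    n = len(y)
    x = list(y)
    for _ in range(n_pass):
        x = [statistics.median(x[max(0, i - half_width): i + half_width + 1])
             for i in range(n)]
    return x
```

An isolated spike is removed in a single pass, while an ideal step is left untouched, which is exactly the edge-preserving behavior that makes median filtering suitable for PWC signals.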
Fast and Flexible ADMM Algorithms for Trend Filtering
Abstract

Cited by 3 (2 self)
This paper focuses on computational approaches for trend filtering, using the alternating direction method of multipliers (ADMM). We argue that, under an appropriate parametrization, ADMM is highly efficient for trend filtering, competitive with the specialized interior point methods that are currently in use. Furthermore, the proposed ADMM implementation is very simple, and importantly, it is flexible enough to extend to many interesting related problems, such as sparse trend filtering and isotonic trend filtering. Software for our method will be made freely available, written in C++ (see ...)
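For orientation, here is a bare-bones sketch of the ADMM splitting for the order-0 case of trend filtering (total variation denoising), with z = Dx split off and a tridiagonal solve for the x-update. This is generic textbook ADMM, not the authors' implementation; ρ and the iteration count are illustrative:

```python
def solve_tridiag(sub, diag, sup, rhs):
    """Thomas algorithm for a tridiagonal system (sub[0], sup[-1] unused)."""
    n = len(rhs)
    cp = [0.0] * n
    dp = [0.0] * n
    cp[0] = sup[0] / diag[0]
    dp[0] = rhs[0] / diag[0]
    for i in range(1, n):
        m = diag[i] - sub[i] * cp[i - 1]
        cp[i] = sup[i] / m if i < n - 1 else 0.0
        dp[i] = (rhs[i] - sub[i] * dp[i - 1]) / m
    x = [0.0] * n
    x[-1] = dp[-1]
    for i in range(n - 2, -1, -1):
        x[i] = dp[i] - cp[i] * x[i + 1]
    return x

def admm_tv(y, lam, rho=1.0, n_iter=1000):
    """ADMM for min_x 0.5*||x - y||^2 + lam*||D x||_1, D = forward difference."""
    n = len(y)
    z = [0.0] * (n - 1)
    w = [0.0] * (n - 1)                      # scaled dual variable
    sub = [-rho] * n
    sup = [-rho] * n
    diag = [1.0 + rho] + [1.0 + 2.0 * rho] * (n - 2) + [1.0 + rho]

    def soft(v, t):                          # soft-thresholding, prox of t*|.|
        return max(v - t, 0.0) + min(v + t, 0.0)

    x = list(y)
    for _ in range(n_iter):
        # x-update: solve (I + rho*D^T D) x = y + rho*D^T (z - w)
        v = [z[i] - w[i] for i in range(n - 1)]
        rhs = list(y)
        rhs[0] -= rho * v[0]
        for i in range(1, n - 1):
            rhs[i] += rho * (v[i - 1] - v[i])
        rhs[n - 1] += rho * v[n - 2]
        x = solve_tridiag(sub, diag, sup, rhs)
        # z-update (soft-threshold) and dual update
        dx = [x[i + 1] - x[i] for i in range(n - 1)]
        z = [soft(dx[i] + w[i], lam / rho) for i in range(n - 1)]
        w = [w[i] + dx[i] - z[i] for i in range(n - 1)]
    return x
```

The x-update is an O(n) banded solve, which is the source of the efficiency the abstract claims; higher-order trend filtering changes only the difference operator D.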
G.: Variational Methods with Higher-Order Derivatives in Image Processing
 Approximation Theory XII: San Antonio 2007
, 2008
Abstract

Cited by 1 (0 self)
This is an overview of recent research of the authors on the application of variational methods with higher-order derivatives in image processing. We focus on gray-valued and matrix-valued images and deal with a purely discrete setting. We show that regularization methods with second-order derivatives can be successfully applied to the denoising of gray-value images. In 1D the solutions of the corresponding minimization problems are discrete polynomial splines (sometimes with higher defects) and inf-convolution splines with certain knots. The proposed methods can be transferred to matrix fields. Due to the operator structure of matrices, new tasks like the preservation of positive definiteness and the meaningful coupling of the matrix components come into play. In recent years, mathematical methods from optimization theory, harmonic analysis, stochastics or partial differential equations were successfully applied ...
Robust Poisson Surface Reconstruction
Abstract

Cited by 1 (0 self)
We propose a method to reconstruct surfaces from oriented point clouds with non-uniform sampling and noise by formulating the problem as a convex minimization that reconstructs the indicator function of the surface's interior. Compared to previous models, our reconstruction is robust to noise and outliers because it substitutes the least-squares fidelity term with a robust Huber penalty; this allows us to recover sharp corners and avoids the shrinking bias of least squares. We choose an implicit parametrization to reconstruct surfaces of unknown topology and close large gaps in the point cloud. For an efficient representation, we approximate the implicit function by a hierarchy of locally supported basis elements adapted to the geometry of the surface. Unlike ad hoc bases over an octree, our hierarchical B-splines from isogeometric analysis locally adapt the mesh and the degree of the splines during reconstruction. The hierarchical structure of the basis speeds up the minimization and efficiently represents clustered data. We also advocate convex optimization, instead of isogeometric finite-element techniques, to efficiently solve the minimization and allow for non-differentiable functionals. Experiments show state-of-the-art performance within a more flexible framework.
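The Huber penalty mentioned above is quadratic for small residuals and grows only linearly for large ones, so a single outlier's influence (the derivative) is capped. A minimal definition; the threshold value is illustrative:

```python
def huber(r, delta=1.0):
    """Huber penalty: 0.5*r^2 for |r| <= delta, and linear growth
    delta*(|r| - 0.5*delta) beyond, so the derivative never exceeds delta."""
    a = abs(r)
    return 0.5 * r * r if a <= delta else delta * (a - 0.5 * delta)
```

By contrast, the least-squares penalty 0.5*r^2 grows quadratically, which is why a few outliers can drag a least-squares fit far from the bulk of the data.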
Larry Wasserman, Co-Chair
, 2015
Abstract
This thesis makes fundamental computational and statistical advances in testing and estimation, making critical progress in the theory and application of classical statistical methods like classification, regression and hypothesis testing, and in understanding the relationships between them. Our work connects multiple fields in often counterintuitive and surprising ways, leading to new theory, new algorithms, and new insights, and ultimately to a cross-fertilization of varied fields like optimization, statistics and machine learning. The first of three thrusts has to do with active learning, a form of sequential learning from feedback-driven queries that often has a provable statistical advantage over passive learning. We unify concepts from two seemingly different areas: active learning and stochastic first-order optimization. We use this unified view to develop new lower bounds for stochastic optimization using tools from active learning, and new algorithms for active learning using ideas from optimization. We also study the effect of feature noise, or errors-in-variables, on ...
Nonparametric Regression (Statistical Machine Learning, Spring 2015)
Abstract
1.1 Basic setup, random inputs • Given a random pair (X, Y) ∈ R^d × R, recall that the function f0(x) = E(Y | X = x) is called the regression function (of Y on X). The basic goal in nonparametric regression is ...
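A standard estimator of the regression function defined above is the Nadaraya-Watson kernel smoother, a locally weighted average of the observed responses. A minimal sketch with a Gaussian kernel; the bandwidth h and function name are illustrative:

```python
import math

def nadaraya_watson(x0, xs, ys, h=0.5):
    """Estimate f0(x0) = E(Y | X = x0) as a kernel-weighted average:
    sum_i K((x0 - x_i)/h) * y_i / sum_i K((x0 - x_i)/h), Gaussian K."""
    w = [math.exp(-((x0 - x) ** 2) / (2.0 * h * h)) for x in xs]
    return sum(wi * yi for wi, yi in zip(w, ys)) / sum(w)
```

With a design that is symmetric around the query point and a linear response, the kernel weights cancel the bias, so the estimate recovers the true regression value.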
Characterization of Minimizers of Convex Regularization Functionals
, 2006
Abstract
We study variational methods of bounded variation type for data analysis. Y. Meyer characterized minimizers of the Rudin-Osher-Fatemi functional in terms of the G-norm of the data. These results, and the follow-up work on this topic, are generalized to functionals defined on spaces of functions with derivatives of finite bounded variation. In order to derive a characterization of minimizers of convex regularization functionals, we use the concept of generalized directional derivatives and duality. Finally, we present some examples where the minimizers of convex regularization functionals are calculated analytically, reproducing some recent results from the literature and adding some novel results with penalization of higher-order derivatives of bounded variation.
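For orientation, in one common parameterization the Rudin-Osher-Fatemi functional and Meyer's G-norm characterization of the trivial minimizer read as follows (scaling conventions for λ and the G-norm vary across the literature, so the constants here are indicative):

```latex
\hat u \;=\; \operatorname*{arg\,min}_{u}\;
    \tfrac{1}{2}\,\|f-u\|_{L^2}^2 \;+\; \lambda\,|u|_{TV},
\qquad
\hat u = 0 \;\Longleftrightarrow\; \|f\|_{G} \le \lambda,
```

where ‖·‖_G denotes Meyer's G-norm, the dual norm of the total variation seminorm; the equivalence follows from the optimality condition f ∈ ∂(λ|·|_{TV})(0).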