Results 1–10 of 22
A signal processing approach to generalized 1D total variation
 IEEE Trans. Signal Process.
, 2011
Cited by 8 (2 self)
Abstract—Total variation (TV) is a powerful method that brings great benefit for edge-preserving regularization. Despite being widely employed in image processing, it has restricted applicability for 1D signal processing, since piecewise-constant signals form a rather limited model for many applications. Here we generalize conventional TV in 1D by extending the derivative operator, which is within the regularization term, to any linear differential operator. This provides flexibility for tailoring the approach to the presence of nontrivial linear systems and for different types of driving signals, such as spike-like, piecewise-constant, and so on. Conventional TV remains a special case of this general framework. We illustrate the feasibility of the method by considering a nontrivial linear system and different types of driving signals. Index Terms—Differential operators, linear systems, regularization, sparsity, total variation.
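The generalized-TV idea described above, replacing the first-difference operator in the penalty by an arbitrary linear operator D, can be sketched as the discretized problem minimize 0.5‖x − y‖² + λ‖Dx‖₁. The ADMM solver below is a minimal illustrative sketch, not the authors' algorithm; the test signal, λ, and ρ are arbitrary choices.

```python
import numpy as np

def soft(v, t):
    """Soft-thresholding: the proximal operator of the l1 norm."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def generalized_tv(y, D, lam, rho=1.0, n_iter=200):
    """Minimize 0.5*||x - y||^2 + lam*||D x||_1 by ADMM.
    D may be any matrix form of a linear (differential) operator."""
    n = len(y)
    x = y.copy()
    z = D @ x
    u = np.zeros_like(z)
    A = np.eye(n) + rho * D.T @ D  # x-update system matrix
    for _ in range(n_iter):
        x = np.linalg.solve(A, y + rho * D.T @ (z - u))
        z = soft(D @ x + u, lam / rho)
        u += D @ x - z
    return x

# First-order difference operator recovers conventional TV as a special case
n = 100
D1 = np.diff(np.eye(n), axis=0)
rng = np.random.default_rng(0)
clean = np.repeat([0.0, 2.0, -1.0, 1.0], n // 4)  # piecewise-constant signal
y = clean + 0.3 * rng.standard_normal(n)
x_hat = generalized_tv(y, D1, lam=1.0)
```

Swapping `D1` for, say, a second-difference matrix would penalize changes of slope instead of changes of level, which is the flexibility the abstract refers to.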
Left-inverses of fractional Laplacian and sparse stochastic processes
, 2012
Cited by 7 (6 self)
The fractional Laplacian (−Δ)^(γ/2) commutes with the primary coordinate transformations in the Euclidean space R^d (dilation, translation, and rotation) and has a tight link to splines, fractals, and stable Lévy processes. For 0 < γ < d, its inverse is the classical Riesz potential I_γ, which is dilation-invariant and translation-invariant. In this work, we investigate the functional properties (continuity, decay, and invertibility) of an extended class of differential operators that share those invariance properties. In particular, we extend the definition of the classical Riesz potential I_γ to any non-integer number γ larger than d and show that it is the unique left-inverse of the fractional Laplacian (−Δ)^(γ/2) which is dilation-invariant and translation-invariant. We observe that, for any 1 ≤ p ≤ ∞ and γ ≥ d(1 − 1/p), there exists a Schwartz function f such that I_γ f is not p-integrable. We then introduce the new unique left-inverse I_{γ,p} of the fractional Laplacian (−Δ)^(γ/2) with the property that I_{γ,p} is dilation-invariant (but not translation-invariant) and that I_{γ,p} f is p-integrable for any Schwartz function f. We finally apply that linear operator I_{γ,p} with p = 1 to solve the stochastic partial differential equation (−Δ)^(γ/2) Φ = w with white Poisson noise as its driving term w.
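The action of (−Δ)^(γ/2) as the Fourier multiplier |ω|^γ, and the need to handle the zero frequency when inverting it, can be illustrated on a periodic, discretized grid. This is a simplified surrogate for the continuous-domain setting of the paper; the zero-frequency handling below is an ad-hoc choice, not the I_{γ,p} construction.

```python
import numpy as np

def frac_laplacian_1d(f, gamma, L=2 * np.pi):
    """Apply (-Delta)^(gamma/2) to samples of an L-periodic signal via the
    Fourier multiplier |omega|^gamma."""
    n = len(f)
    omega = 2 * np.pi * np.fft.fftfreq(n, d=L / n)
    return np.real(np.fft.ifft(np.abs(omega) ** gamma * np.fft.fft(f)))

def riesz_potential_1d(g, gamma, L=2 * np.pi):
    """Discrete left-inverse: multiplier |omega|^(-gamma), with the zero
    frequency (where the inverse is ill-defined) simply set to 0."""
    n = len(g)
    omega = 2 * np.pi * np.fft.fftfreq(n, d=L / n)
    mult = np.zeros(n)
    nz = omega != 0
    mult[nz] = np.abs(omega[nz]) ** (-gamma)
    return np.real(np.fft.ifft(mult * np.fft.fft(g)))

t = np.linspace(0.0, 2 * np.pi, 256, endpoint=False)
f = np.sin(3 * t)                       # zero-mean test signal
g = frac_laplacian_1d(f, gamma=1.2)     # equals 3**1.2 * sin(3t)
f_rec = riesz_potential_1d(g, gamma=1.2)
```

On a zero-mean signal the round trip is exact up to floating-point error, which is why the DC mode is the only delicate point of the inversion.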
Bayesian Estimation for Continuous-Time Sparse Stochastic Processes
Cited by 6 (6 self)
Abstract—We consider continuous-time sparse stochastic processes from which we have only a finite number of noisy/noiseless samples. Our goal is to estimate the noiseless samples (denoising) and the signal in between (interpolation problem). By relying on tools from the theory of splines, we derive the joint a priori distribution of the samples and show how this probability density function can be factorized. The factorization enables us to tractably implement the maximum a posteriori (MAP) and minimum mean-square error (MMSE) criteria as two statistical approaches for estimating the unknowns. We compare the derived statistical methods with well-known techniques for the recovery of sparse signals, such as the ℓ1-norm and Log (ℓ1-ℓ0 relaxation) regularization methods. The simulation results show that, under certain conditions, the performance of the regularization techniques can be very close to that of the MMSE estimator.
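The MAP-versus-MMSE comparison in the abstract can be made concrete in the simplest scalar case: one noisy sample with a Laplace prior. There the MAP estimator is exactly soft-thresholding (the scalar analogue of ℓ1 regularization), while the MMSE estimator is the posterior mean. This is a hedged illustration of the general principle, not the paper's spline-based construction; the grid bounds and resolution are arbitrary.

```python
import numpy as np

def mmse_laplace(y, sigma=1.0, b=1.0):
    """MMSE (posterior-mean) estimate of x from y = x + n, with Gaussian
    noise n ~ N(0, sigma^2) and Laplace prior p(x) ~ exp(-|x|/b),
    computed by brute-force quadrature on a fine grid."""
    x = np.linspace(-30.0, 30.0, 20001)
    log_post = -0.5 * ((y - x) / sigma) ** 2 - np.abs(x) / b
    post = np.exp(log_post)  # unnormalized posterior density
    return float(np.sum(x * post) / np.sum(post))

def map_laplace(y, sigma=1.0, b=1.0):
    """MAP estimate under the same model: soft-thresholding with
    threshold sigma^2 / b."""
    t = sigma ** 2 / b
    return float(np.sign(y) * max(abs(y) - t, 0.0))
```

For y = 0.5 (below threshold) the MAP estimate is exactly 0 while the MMSE estimate is small but nonzero, which mirrors the reported gap between the regularization techniques and the MMSE estimator.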
Reconstruction of biomedical images and sparse stochastic modelling
 in Proc. Int. Symp. Biomedical Imaging
, 2012
Cited by 6 (6 self)
We propose a novel statistical formulation of the image-reconstruction problem from noisy linear measurements. We derive an extended family of MAP estimators based on the theory of continuous-domain sparse stochastic processes. We highlight the crucial roles of the whitening operator and of the Lévy exponent of the innovations, which controls the sparsity of the model. While our family of estimators includes the traditional methods of Tikhonov and total-variation (TV) regularization as particular cases, it opens the door to a much broader class of potential functions (associated with infinitely divisible priors) that are inherently sparse and typically nonconvex. We also provide an algorithmic scheme, naturally suggested by our framework, that can handle arbitrary potential functions. Further, we consider the reconstruction of simulated MRI data and illustrate that the designed estimators can bring significant improvement in reconstruction performance. Index Terms—sparse stochastic processes, statistical estimation, sparsity-promoting regularization, nonconvex optimization.
A unified formulation of Gaussian versus sparse stochastic processes. Part I: Continuous-domain theory
 IEEE Trans. Information Theory
, 2014
Cited by 3 (1 self)
Abstract—We introduce a general distributional framework that results in a unifying description and characterization of a rich variety of continuous-time stochastic processes. The cornerstone of our approach is an innovation model that is driven by some generalized white noise process, which may be Gaussian or not (e.g., Laplace, impulsive Poisson, or alpha-stable). This allows for a conceptual decoupling between the correlation properties of the process, which are imposed by the whitening operator L, and its sparsity pattern, which is determined by the type of noise excitation. The latter is fully specified by a Lévy measure. We show that the range of admissible innovation behavior varies between the purely Gaussian and super-sparse extremes. We prove that the corresponding generalized stochastic processes are well-defined mathematically provided that the (adjoint) inverse of the whitening operator satisfies some L_p bound for p ≥ 1. We present a novel operator-based method that yields an explicit characterization of all Lévy-driven processes that are solutions of constant-coefficient stochastic differential equations. When the underlying system is stable, we recover the family of stationary continuous-time autoregressive moving average (CARMA) processes, including the Gaussian ones. The approach remains valid when the system is unstable and leads to the identification of potentially useful generalizations of the Lévy processes, which are sparse and nonstationary. Finally, we show that these processes admit a sparse representation in some matched wavelet domain and provide a full characterization of their transform-domain statistics. Index Terms—Sparsity, non-Gaussian stochastic processes, innovation modeling, continuous-time signals, stochastic differential equations, wavelet expansion, Lévy process, infinite divisibility.
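The decoupling the abstract describes, correlation fixed by the whitening operator L, sparsity fixed by the innovation, can be illustrated with a crude Euler discretization of a first-order model Lx = (d/dt + a)x = w, driven once by Gaussian white noise and once by impulsive Poisson noise. All numerical choices below (a, dt, spike rate, scalings) are illustrative assumptions, not values from the paper.

```python
import numpy as np

def innovation_ar1(w, a=0.9, dt=0.01):
    """Euler discretization of (d/dt + a) x = w:
    x[k+1] = (1 - a*dt) * x[k] + dt * w[k]."""
    x = np.zeros(len(w) + 1)
    for k, wk in enumerate(w):
        x[k + 1] = (1.0 - a * dt) * x[k] + dt * wk
    return x[1:]

rng = np.random.default_rng(1)
n, dt = 5000, 0.01
# Gaussian white-noise innovation
w_gauss = rng.standard_normal(n) / np.sqrt(dt)
# Impulsive Poisson innovation: rare spikes with Gaussian amplitudes
spikes = rng.random(n) < 0.02
w_pois = np.where(spikes, rng.standard_normal(n) / dt, 0.0)
x_gauss = innovation_ar1(w_gauss, dt=dt)
x_pois = innovation_ar1(w_pois, dt=dt)
```

Both realizations share the same exponential correlation imposed by L, but the Poisson-driven one is visibly sparse: its increments are heavy-tailed (a few large jumps among many near-zero values), which is the sparsity pattern determined by the noise excitation.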
Bayesian Denoising of Generalized Poisson Processes with Finite Rate of Innovation
Cited by 2 (2 self)
We investigate the problem of the optimal reconstruction of a generalized Poisson process from its noisy samples. The process is known to have a finite rate of innovation, since it is generated by a random stream of Diracs with a finite average number of impulses per unit interval. We formulate the recovery problem in a Bayesian framework and explicitly derive the joint probability density function (pdf) of the sampled signal. We compare the performance of the optimal minimum mean-square error (MMSE) estimator with common regularization techniques such as ℓ1 and Log penalty functions. The simulation results indicate that, under certain conditions, the regularization techniques can achieve a performance close to the MMSE method.
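The signal model in this abstract, a random stream of Diracs with a finite average number of impulses per unit interval, observed through integration plus noise, can be simulated directly. The rate, amplitude law, and noise level below are arbitrary illustrative choices, and the sketch stops at generating the noisy samples (the estimation step is the subject of the paper).

```python
import numpy as np

def generalized_poisson(T=10.0, rate=2.0, rng=None):
    """Random stream of Diracs on [0, T]: Poisson(rate*T) impulses at
    uniform locations with i.i.d. Gaussian amplitudes."""
    if rng is None:
        rng = np.random.default_rng()
    n_imp = rng.poisson(rate * T)
    locs = np.sort(rng.uniform(0.0, T, n_imp))
    amps = rng.standard_normal(n_imp)
    return locs, amps

def sample_integrated(locs, amps, t):
    """Noiseless samples of the integrated (piecewise-constant) process."""
    return np.array([amps[locs <= tk].sum() for tk in t])

rng = np.random.default_rng(2)
locs, amps = generalized_poisson(rng=rng)
t = np.linspace(0.0, 10.0, 200)
s = sample_integrated(locs, amps, t)
y = s + 0.05 * rng.standard_normal(len(t))  # noisy samples to be denoised
```

The innovation rate is finite because each realization on [0, T] is fully described by 2·n_imp numbers (locations and amplitudes), regardless of how densely it is sampled.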
On the Continuity of Characteristic Functionals and Sparse Stochastic Modeling
, 2014
Cited by 1 (1 self)
Abstract. The characteristic functional is the infinite-dimensional generalization of the Fourier transform for measures on function spaces. It characterizes the statistical law of the associated stochastic process in the same way as a characteristic function specifies the probability distribution of its corresponding random variable. Our goal in this work is to lay the foundations of the innovation model, a (possibly) non-Gaussian probabilistic model for sparse signals. This is achieved by using the characteristic functional to specify sparse stochastic processes that are defined as linear transformations of general continuous-domain white Lévy noises (also called innovation processes). We prove the existence of a broad class of sparse processes by using the Minlos–Bochner theorem. This requires a careful study of the regularity properties, especially the L_p-boundedness, of the characteristic functional of the innovations. We are especially interested in the functionals that are only defined for p < 1, since they appear to be associated with the sparser kind of processes. Finally, we apply our main theorem of existence to two specific subclasses of processes with specific invariance properties.
MMSE Estimation of Sparse Lévy Processes
Abstract—We investigate a stochastic signal-processing framework for signals with sparse derivatives, where the samples of a Lévy process are corrupted by noise. The proposed signal model covers the well-known Brownian motion and piecewise-constant Poisson process; moreover, the Lévy family also contains other interesting members exhibiting heavy-tail statistics that fulfill the requirements of compressibility. We characterize the maximum-a-posteriori probability (MAP) and minimum mean-square error (MMSE) estimators for such signals. Interestingly, some of the MAP estimators for the Lévy model coincide with popular signal-denoising algorithms (e.g., total-variation (TV) regularization). We propose a novel non-iterative implementation of the MMSE estimator based on the belief-propagation algorithm performed in the Fourier domain. Our algorithm takes advantage of the fact that the joint statistics of general Lévy processes are much easier to describe by their characteristic function, as the probability densities do not always admit closed-form expressions. We then use our new estimator as a benchmark to compare the performance of existing algorithms for the optimal recovery of gradient-sparse signals. Index Terms—Lévy process, stochastic modeling, sparse-signal estimation, nonlinear reconstruction, total-variation estimation, belief propagation (BP), message passing.