Results 11–20 of 43
A signal processing approach to generalized 1D total variation
 IEEE Trans. Signal Process., 2011
Cited by 8 (2 self)
Abstract—Total variation (TV) is a powerful method that brings great benefit for edge-preserving regularization. Despite being widely employed in image processing, it has restricted applicability for 1D signal processing since piecewise-constant signals form a rather limited model for many applications. Here we generalize conventional TV in 1D by extending the derivative operator, which is within the regularization term, to any linear differential operator. This provides flexibility for tailoring the approach to the presence of nontrivial linear systems and for different types of driving signals such as spike-like, piecewise-constant, and so on. Conventional TV remains a special case of this general framework. We illustrate the feasibility of the method by considering a nontrivial linear system and different types of driving signals. Index Terms—Differential operators, linear systems, regularization, sparsity, total variation.
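The core idea of this abstract — replacing the first-difference operator inside the TV penalty with a more general linear operator — can be illustrated with a small ADMM solver. This is a generic sketch, not the authors' algorithm; the operator choice and the parameters `lam` and `rho` are illustrative.

```python
import numpy as np

def soft(v, t):
    """Elementwise soft threshold: the proximal map of t * ||.||_1."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def generalized_tv_denoise(y, D, lam, rho=1.0, n_iter=200):
    """Solve min_x 0.5*||y - x||^2 + lam*||D x||_1 by ADMM.

    D is any linear 'derivative-like' operator given as a matrix;
    D = first-difference matrix recovers conventional 1D TV.
    """
    A = np.eye(len(y)) + rho * (D.T @ D)       # x-update normal matrix
    z = np.zeros(D.shape[0])
    u = np.zeros(D.shape[0])
    x = y.copy()
    for _ in range(n_iter):
        x = np.linalg.solve(A, y + rho * D.T @ (z - u))
        Dx = D @ x
        z = soft(Dx + u, lam / rho)            # shrink in the transform domain
        u += Dx - z                            # dual ascent step
    return x

def diff_op(n, order=1):
    """Finite-difference matrix of the given order; order=1 gives classic TV."""
    return np.diff(np.eye(n), n=order, axis=0)
```

With `diff_op(n, 2)` the same solver favors piecewise-linear rather than piecewise-constant reconstructions, which is the kind of flexibility the abstract describes.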
Bayesian Estimation for Continuous-Time Sparse Stochastic Processes
Cited by 6 (6 self)
Abstract—We consider continuous-time sparse stochastic processes from which we have only a finite number of noisy/noiseless samples. Our goal is to estimate the noiseless samples (denoising) and the signal in between (interpolation problem). By relying on tools from the theory of splines, we derive the joint a priori distribution of the samples and show how this probability density function can be factorized. The factorization enables us to tractably implement the maximum a posteriori (MAP) and minimum mean-square error (MMSE) criteria as two statistical approaches for estimating the unknowns. We compare the derived statistical methods with well-known techniques for the recovery of sparse signals, such as the norm and Log (relaxation) regularization methods. The simulation results show that, under certain conditions, the performance of the regularization techniques can be very close to that of the MMSE estimator.
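A toy scalar version of the MAP-versus-MMSE comparison made in this abstract can be run directly. This is a sketch under simplified assumptions (i.i.d. Laplace-distributed signal, additive Gaussian noise) and does not reproduce the paper's spline-based machinery; under a Laplace prior, MAP reduces to soft thresholding, while MMSE is the posterior mean, here computed by numerical integration.

```python
import numpy as np

def map_laplace(y, sigma, b):
    """MAP estimate under a Laplace(0, b) prior and N(0, sigma^2) noise:
    soft thresholding with threshold sigma^2 / b."""
    t = sigma ** 2 / b
    return np.sign(y) * np.maximum(np.abs(y) - t, 0.0)

def mmse_laplace(y, sigma, b, grid=np.linspace(-15, 15, 601)):
    """MMSE estimate E[x | y] by numerical integration of the posterior."""
    # posterior(x | y) is proportional to exp(-(y-x)^2/(2 sigma^2) - |x|/b)
    logp = (-(y[:, None] - grid[None, :]) ** 2 / (2 * sigma ** 2)
            - np.abs(grid[None, :]) / b)
    w = np.exp(logp - logp.max(axis=1, keepdims=True))
    return (w * grid).sum(axis=1) / w.sum(axis=1)

rng = np.random.default_rng(1)
x = rng.laplace(scale=1.0, size=5000)
y = x + rng.standard_normal(x.size)              # noise sigma = 1
mse_map = np.mean((map_laplace(y, 1.0, 1.0) - x) ** 2)
mse_mmse = np.mean((mmse_laplace(y, 1.0, 1.0) - x) ** 2)
```

Consistent with the abstract's message, the gap between the two estimators depends on the noise level and prior; MMSE is optimal in mean-square error by construction.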
Generalized L-spline wavelet bases
 in Proceedings of the SPIE Conference on Mathematical Imaging: Wavelet XI
Cited by 5 (4 self)
We build wavelet-like functions based on a parametrized family of pseudo-differential operators L~ν that satisfy some admissibility and scalability conditions. The shifts of the generalized B-splines, which are localized versions of the Green function of L~ν, generate a family of L-spline spaces. These spaces have approximation order equal to the order of the underlying operator. A sequence of embedded spaces is obtained by choosing a dyadic scale progression a = 2^i. The consecutive inclusion of the spaces yields the refinement equation, where the scaling filter depends on scale. The generalized L-wavelets are then constructed as basis functions for the orthogonal complements of spline spaces. The vanishing-moment property of conventional wavelets is generalized to the vanishing null-space-element property. In spite of the scale dependence of the filters, the wavelet decomposition can be performed using an adapted version of Mallat's filterbank algorithm.
Regularized Interpolation for Noisy Images
Cited by 3 (2 self)
Abstract—Interpolation is the means by which a continuously defined model is fit to discrete data samples. When the data samples are free of noise, it seems desirable to build the model by fitting them exactly. In medical imaging, where quality is of paramount importance, this ideal situation unfortunately does not occur. In this paper, we propose a scheme that improves on the quality by specifying a trade-off between fidelity to the data and robustness to the noise. We resort to variational principles, which allow us to impose smoothness constraints on the model for tackling noisy data. Based on shift-, rotation-, and scale-invariant requirements on the model, we show that the norm of an appropriate vector derivative is the most suitable choice of regularization for this purpose. In addition to Tikhonov-like quadratic regularization, this includes edge-preserving total-variation-like (TV) regularization. We give algorithms to recover the continuously defined model from noisy samples and also provide a data-driven scheme to determine the optimal amount of regularization. We validate our method with numerical examples where we demonstrate its superiority over an exact fit, as well as the benefit of TV-like nonquadratic regularization over Tikhonov-like quadratic regularization. Index Terms—Interpolation, regularization, regularization parameter, splines, Tikhonov functional, total-variation functional.
Multisource data fusion for bandlimited signals: a Bayesian perspective
Cited by 3 (1 self)
Abstract. We consider data fusion as the reconstruction of a single model from multiple data sources. The model is to be inferred from a number of blurred and noisy observations, possibly from different sensors under various conditions. It is all about recovering a compound object, signal+uncertainties, that best relates to the observations and contains all the useful information from the initial data set. We wish to provide a flexible framework for bandlimited signal reconstruction from multiple data. In this paper, we focus on a general approach involving forward modeling (prior model, data acquisition) and Bayesian inference. The proposed method is valid for nD objects (signals, images, or volumes) with multidimensional spatial elements. For the sake of clarity, both the formalism and test results will be shown in 1D for single-band signals. The main originality lies in seeking an object with a prescribed bandwidth, hence our choice of a B-spline representation. This ensures optimal sampling in both signal and frequency spaces, and allows for shift-invariant processing. The model resolution, the geometric distortions, the blur, and the regularity of the sampling grid can be arbitrary for each sensor. The method is designed to handle realistic Gauss+Poisson noise. We obtained promising results in reconstructing a super-resolved signal from two blurred and noisy shifted observations, using a Gaussian Markov chain as a prior. Practical applications are under development within the SpaceFusion project. For instance, in astronomical imaging, we aim at a sharp, well-sampled, noise-free, and possibly super-resolved image. Virtual Observatories could benefit from such a way to combine large numbers of multispectral images from various sources. In planetary imaging or remote sensing, a 3D image-formation model is needed; nevertheless, this can be addressed within the same framework.
Keywords: Model-based data fusion, uncertainties, generative models, inverse problems, signal reconstruction, super-resolution, resampling, B-splines
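In the degenerate special case of Gaussian noise with no blur, shift, or resampling, Bayesian fusion of two sensors reduces to inverse-variance weighting of the observations. The following is a sketch of that special case only, for intuition; the paper's framework handles blur, geometric distortion, irregular grids, and Gauss+Poisson noise, none of which appear here.

```python
import numpy as np

def fuse(y1, s1, y2, s2):
    """MMSE fusion of two independent Gaussian-noise observations of the
    same signal (flat prior): inverse-variance weighted average."""
    w1, w2 = 1.0 / s1 ** 2, 1.0 / s2 ** 2
    return (w1 * y1 + w2 * y2) / (w1 + w2)

rng = np.random.default_rng(3)
x = rng.standard_normal(10000)                  # underlying signal samples
y1 = x + 1.0 * rng.standard_normal(x.size)      # coarse sensor, sigma = 1.0
y2 = x + 0.5 * rng.standard_normal(x.size)      # finer sensor,  sigma = 0.5
xf = fuse(y1, 1.0, y2, 0.5)
```

The fused estimate has noise variance 1/(1/s1² + 1/s2²) — here 0.2, below even the better sensor's 0.25 — which is the sense in which fusion "contains all the useful information" from both data sets.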
Noninvertible Gabor Transforms
Cited by 2 (2 self)
Abstract—Time-frequency analysis, such as the Gabor transform, plays an important role in many signal processing applications. The redundancy of such representations is often directly related to the computational load of any algorithm operating in the transform domain. To reduce complexity, it may be desirable to increase the time and frequency sampling intervals beyond the point where the transform is invertible, at the cost of an inevitable recovery error. In this paper we initiate the study of recovery procedures for noninvertible Gabor representations. We propose using fixed analysis and synthesis windows, chosen, e.g., according to implementation constraints, and processing the Gabor coefficients prior to synthesis in order to shape the reconstructed signal. We develop three methods for signal recovery. The first follows from the consistency requirement, namely that the recovered signal has the same Gabor representation as the input signal. The second is based on minimization of a worst-case error. Last, we develop a recovery technique based on the assumption that the input signal lies in some subspace of L2. We show that for each of the criteria, the manipulation of the transform coefficients amounts to a 2D twisted convolution, which we show how to perform using a filterbank. When the undersampling factor is an integer, the processing reduces to standard 2D convolution. We provide simulation results demonstrating the advantages and weaknesses of each of the algorithms. Index Terms—Gabor transform, sampling, twisted convolution.
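The invertibility-versus-redundancy trade-off at the heart of this abstract can be made concrete with an explicit (dense, toy-sized) Gabor atom matrix and least-squares recovery. This is only a sketch: the paper's twisted-convolution/filterbank processing and its three recovery criteria are far more refined than the plain pseudo-inverse used here, and the window and grid parameters are arbitrary.

```python
import numpy as np

def gabor_atoms(n, win_len, hop, n_freq):
    """Dense matrix whose rows are time-frequency shifts of a Gaussian window."""
    t = np.arange(win_len)
    g = np.exp(-0.5 * ((t - (win_len - 1) / 2) / (win_len / 4)) ** 2)
    rows = []
    for t0 in range(0, n - win_len + 1, hop):          # time shifts
        for k in range(n_freq):                        # modulations
            atom = np.zeros(n, dtype=complex)
            atom[t0:t0 + win_len] = g * np.exp(2j * np.pi * k * t / n_freq)
            rows.append(atom)
    return np.array(rows)

def analyze_then_recover(x, A):
    """Gabor analysis followed by least-squares (pseudo-inverse) synthesis."""
    c = A.conj() @ x                  # transform coefficients
    return np.linalg.pinv(A.conj()) @ c

rng = np.random.default_rng(4)
x = rng.standard_normal(64)
A_over = gabor_atoms(64, 16, hop=8, n_freq=16)    # 112 coefficients: invertible
A_under = gabor_atoms(64, 16, hop=16, n_freq=8)   #  32 coefficients: lossy
err_over = np.linalg.norm(analyze_then_recover(x, A_over) - x) / np.linalg.norm(x)
err_under = np.linalg.norm(analyze_then_recover(x, A_under) - x) / np.linalg.norm(x)
```

With more coefficients than samples the representation is invertible and recovery is exact; once the time/frequency grid is thinned below the signal dimension, an irreducible error appears — the regime the paper studies.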
Construction of wavelet bases that mimic the behaviour of some given operator
 in Proc. SPIE Conf. Mathematical Imaging: Wavelet XII
Cited by 2 (2 self)
Probably the most important property of wavelets for signal processing is their multiscale derivative-like behavior when applied to functions. In order to extend the class of problems that can profit from wavelet-based techniques, we propose to build new families of wavelets that behave like an arbitrary scale-covariant operator. Our extension is general and includes many known wavelet bases. At the same time, the method takes advantage of a fast filterbank decomposition-reconstruction algorithm. We give necessary conditions for the scale-covariant operator to admit our wavelet construction, and we provide examples of new wavelets that can be obtained with our method.
On the Linearity of Bayesian Interpolators for Non-Gaussian Continuous-Time AR(1) Processes
Cited by 1 (1 self)
Abstract—Bayesian estimation problems involving Gaussian distributions often result in linear estimation techniques. Nevertheless, there are no general statements as to whether the linearity of the Bayesian estimator is restricted to the Gaussian case. The two common strategies for non-Gaussian models are either finding the best linear estimator or numerically evaluating the Bayesian estimator by Monte Carlo methods. In this paper, we focus on Bayesian interpolation of non-Gaussian first-order autoregressive (AR) processes where the driving innovation can admit any symmetric infinitely divisible distribution characterized by the Lévy–Khintchine representation theorem. We redefine the Bayesian estimation problem in the Fourier domain with the help of characteristic forms. By providing analytic expressions, we show that the optimal interpolator is linear for all symmetric α-stable distributions. The Bayesian interpolator can be expressed in a convolutive form where the kernel is described in terms of exponential splines. We also show that the limiting case of Lévy-type AR(1) processes, the system of which has a pole at the origin, always corresponds to a linear Bayesian interpolator made of a piecewise linear spline, irrespective of the innovation distribution. Finally, we show the two mentioned cases to be the only ones within the family for which the Bayesian interpolator is linear. Index Terms—Alpha-stable innovation, autoregressive, Bayesian estimator, interpolation, Ornstein–Uhlenbeck process.
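For the Gaussian member of this family — the Ornstein–Uhlenbeck process, with stationary covariance r(τ) = e^{-θ|τ|} — the linear interpolator between two samples can be written in closed form. This is a sketch of the well-known Gaussian special case only, not the paper's general Lévy-process construction; the exponential terms below are the "exponential spline" flavor the abstract mentions.

```python
import numpy as np

def ou_interp_weights(t, T, theta):
    """Weights (w0, w1) such that E[x(t) | x(0), x(T)] = w0*x(0) + w1*x(T)
    for a stationary Ornstein-Uhlenbeck process with covariance exp(-theta*|tau|).
    Obtained by solving the 2x2 Gaussian conditioning system in closed form."""
    rho = np.exp(-theta)
    den = 1.0 - rho ** (2 * T)
    w0 = (rho ** t - rho ** (2 * T - t)) / den
    w1 = (rho ** (T - t) - rho ** (T + t)) / den
    return w0, w1
```

As θ → 0 (pole of the system at the origin) the weights tend to w0 = 1 − t/T and w1 = t/T, i.e., the interpolator degenerates to the piecewise-linear spline noted in the abstract.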
Beyond bandlimited sampling: A review of nonlinearities, smoothness, and sparsity
, 2009
Cited by 1 (0 self)
Digital applications have developed rapidly over the last few decades. Since many sources of information are of analog or continuous-time nature, discrete-time signal processing (DSP) inherently relies on sampling a continuous-time signal to obtain a discrete-time representation. Consequently, sampling theories lie at the heart of signal processing devices and communication systems. Examples include sampling rate conversion for software radio [1] and between audio formats [2], biomedical imaging [3], lens distortion correction and the formation of image mosaics [4], and super-resolution of image sequences [5]. To accommodate high operating rates while retaining low …