Results 1-10 of 79
Image registration methods: a survey
IMAGE AND VISION COMPUTING, 2003
Abstract
Cited by 734 (9 self)
This paper aims to present a review of recent as well as classic image registration methods. Image registration is the process of overlaying images (two or more) of the same scene taken at different times, from different viewpoints, and/or by different sensors. The registration geometrically aligns two images (the reference and sensed images). The reviewed approaches are classified according to their nature (area-based and feature-based) and according to the four basic steps of the image registration procedure: feature detection, feature matching, mapping function design, and image transformation and resampling. Main contributions, advantages, and drawbacks of the methods are mentioned in the paper. Problematic issues of image registration and an outlook for future research are discussed as well. The major goal of the paper is to provide a comprehensive reference source for researchers involved in image registration, regardless of particular application areas.
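The mapping-function-design step named above can be sketched in a few lines: given matched feature points, a least-squares fit recovers the geometric transform that aligns the sensed image to the reference. The point sets and the affine model below are illustrative assumptions, not taken from the survey.

```python
import numpy as np

def estimate_affine(src, dst):
    """Least-squares affine transform mapping src points onto dst points.
    src, dst: (N, 2) arrays of matched feature coordinates, N >= 3.
    Returns a 2x3 matrix A with dst ~= [x, y, 1] @ A.T."""
    n = src.shape[0]
    X = np.hstack([src, np.ones((n, 1))])        # homogeneous source coordinates
    A, *_ = np.linalg.lstsq(X, dst, rcond=None)  # solve X @ A = dst in the LS sense
    return A.T

# Hypothetical matched points related by a pure translation (+2, -1)
src = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
dst = src + np.array([2.0, -1.0])
A = estimate_affine(src, dst)   # recovers the identity linear part plus the translation
```

In practice the matches come from the feature detection and matching steps, and robust estimators (e.g., RANSAC) guard against mismatches.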
Splines: A Perfect Fit for Signal/Image Processing
IEEE SIGNAL PROCESSING MAGAZINE, 1999
Sampling—50 years after Shannon
Proceedings of the IEEE, 2000
Abstract
Cited by 340 (27 self)
This paper presents an account of the current state of sampling, 50 years after Shannon’s formulation of the sampling theorem. The emphasis is on regular sampling, where the grid is uniform. This topic has benefited from a strong research revival during the past few years, thanks in part to the mathematical connections that were made with wavelet theory. To introduce the reader to the modern, Hilbert-space formulation, we reinterpret Shannon’s sampling procedure as an orthogonal projection onto the subspace of bandlimited functions. We then extend the standard sampling paradigm to a representation of functions in the more general class of “shift-invariant” function spaces, including splines and wavelets. Practically, this allows for simpler, and possibly more realistic, interpolation models, which can be used in conjunction with a much wider class of (anti-aliasing) prefilters that are not necessarily ideal lowpass. We summarize and discuss the results available for the determination of the approximation error and of the sampling rate when the input of the system is essentially arbitrary, e.g., non-bandlimited. We also review variations of sampling that can be understood from the same unifying perspective. These include wavelets, multiwavelets, Papoulis generalized sampling, finite elements, and frames. Irregular sampling and radial basis functions are briefly mentioned. Keywords: bandlimited functions, Hilbert spaces, interpolation, least squares approximation, projection operators, sampling.
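The shift-invariant-space viewpoint is easy to exercise numerically: prefilter the samples so that a cubic B-spline expansion interpolates them, then evaluate the expansion anywhere. A minimal sketch using SciPy's spline machinery on an assumed test signal:

```python
import numpy as np
from scipy.ndimage import map_coordinates

# Samples of a smooth signal on a uniform grid
x = np.arange(16, dtype=float)
f = np.sin(0.4 * x)

# Cubic B-spline interpolation (order=3): map_coordinates prefilters the
# samples so the spline model reproduces them exactly at the grid points,
# then evaluates the shift-invariant expansion at arbitrary locations.
xq = np.array([2.0, 2.5, 7.25])
fq = map_coordinates(f, [xq], order=3, mode='nearest')
```

At integer locations the spline returns the original samples (the interpolation condition); in between, it evaluates the continuous spline model rather than an ideal-lowpass reconstruction.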
A chronology of interpolation: From ancient astronomy to modern signal and image processing
Proceedings of the IEEE, 2002
Abstract
Cited by 102 (0 self)
This paper presents a chronological overview of the developments in interpolation theory, from the earliest times to the present date. It brings out the connections between the results obtained in different ages, thereby putting the techniques currently used in signal and image processing into historical perspective. A summary of the insights and recommendations that follow from relatively recent theoretical as well as experimental studies concludes the presentation. Keywords: approximation, convolution-based interpolation, history, image processing, polynomial interpolation, signal processing, splines. “It is an extremely useful thing to have knowledge of the true origins of memorable discoveries, especially those that have been found not by accident but by dint of meditation. It is not so much that thereby history may attribute to each man his own discoveries and others should be encouraged to earn like commendation, as that the art of making discoveries should be extended by considering noteworthy examples of it.”
Image super-resolution using gradient profile prior, 2008
Abstract
Cited by 69 (4 self)
In this paper, we propose an image super-resolution approach using a novel generic image prior, the gradient profile prior, which is a parametric prior describing the shape and the sharpness of image gradients. Using the gradient profile prior learned from a large number of natural images, we can provide a constraint on image gradients when we estimate a high-resolution image from a low-resolution image. With this simple but very effective prior, we are able to produce state-of-the-art results. The reconstructed high-resolution image is sharp, with rare ringing or jaggy artifacts.
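The "sharpness" that such a prior parameterizes can be illustrated on a 1-D edge: the gradient magnitude across a blurred edge forms a wider profile than across a sharp one. The width measure and the tanh edges below are illustrative stand-ins, not the paper's estimator.

```python
import numpy as np

def profile_width(signal):
    """Second moment of the gradient-magnitude profile about its centroid,
    a simple stand-in for a gradient-profile sharpness parameter."""
    g = np.abs(np.diff(signal))
    x = np.arange(g.size)
    c = np.sum(x * g) / np.sum(g)              # centroid of the profile
    return np.sum((x - c) ** 2 * g) / np.sum(g)

x = np.linspace(-5.0, 5.0, 101)
sharp = np.tanh(5.0 * x)    # sharp step edge
blurred = np.tanh(1.0 * x)  # same edge after smoothing: wider gradient profile
```

A super-resolution method can then penalize estimated gradients whose profiles are wider than the learned prior predicts for natural images.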
Generalized smoothing splines and the optimal discretization of the Wiener filter
IEEE Trans. Signal Process., 2005
Abstract
Cited by 43 (24 self)
We introduce an extended class of cardinal L-splines, where L is a pseudodifferential operator satisfying some admissibility conditions. We show that the L-spline signal interpolation problem is well posed and that its solution is the unique minimizer of the spline energy functional, subject to the interpolation constraint. Next, we consider the corresponding regularized least squares estimation problem, which is more appropriate for dealing with noisy data. The criterion to be minimized is the sum of a quadratic data term, which forces the solution to be close to the input samples, and a “smoothness” term that privileges solutions with small spline energies. Here, too, we find that the optimal solution, among all possible functions, is a cardinal L-spline. We show that this smoothing spline estimator has a stable representation in a B-spline-like basis and that its coefficients can be computed by digital filtering of the input signal. We describe an efficient recursive filtering algorithm that is applicable whenever the transfer function of L is rational (which corresponds to the case of exponential splines). We justify these algorithms statistically by establishing an equivalence between L-spline smoothing and the minimum mean square error (MMSE) estimation of a stationary signal corrupted by white Gaussian noise. In this model-based formulation, the optimum operator L is the whitening filter of the process, and the regularization parameter is proportional to the noise variance. Thus, the proposed formalism yields the optimal discretization of the classical Wiener filter, together with a fast recursive algorithm. It extends the standard Wiener solution by providing the optimal interpolation space. We also present a Bayesian interpretation of the algorithm. Index Terms: nonparametric estimation, recursive filtering, smoothing splines, splines (polynomial and exponential), stationary processes, variational principle, Wiener filter.
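A discrete analogue of the regularized criterion makes the idea concrete: minimize ||y - s||^2 + lam * ||D2 s||^2, where the second-difference matrix D2 stands in for the operator L. The direct solve below, and the toy noisy sine, are an illustrative sketch, not the paper's recursive exponential-spline filter.

```python
import numpy as np

def smoothing_spline(y, lam):
    """Minimizer of ||y - s||^2 + lam * ||D2 s||^2 via the normal equations."""
    n = y.size
    D2 = np.diff(np.eye(n), n=2, axis=0)             # (n-2) x n second-difference matrix
    return np.linalg.solve(np.eye(n) + lam * D2.T @ D2, y)

rng = np.random.default_rng(0)
t = np.linspace(0.0, 1.0, 200)
clean = np.sin(2 * np.pi * t)
noisy = clean + 0.3 * rng.standard_normal(t.size)    # white Gaussian noise
smooth = smoothing_spline(noisy, lam=50.0)
```

Consistent with the Wiener interpretation, a larger noise variance calls for a proportionally larger lam.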
Compressive coded aperture video reconstruction
In European Signal Processing Conference (EUSIPCO), 2008
Abstract
Cited by 42 (5 self)
This paper concerns compressive sensing methods for overcoming the pixel-limited resolution of digital video imaging systems. Recent developments in coded aperture mask designs have led to the reconstruction of static images from a single, low-resolution, noisy observation image. Our methods apply these coded mask designs to each video frame and use compressive sensing optimization techniques for enhanced-resolution digital video recovery. We demonstrate that further improvements can be attained by solving for multiple frames simultaneously, even when the total computation time budget is held fixed.
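The reconstruction step in such compressive schemes typically solves an l1-regularized least-squares problem. A minimal iterative soft-thresholding (ISTA) sketch on a random toy system, not a coded-aperture model:

```python
import numpy as np

def ista(A, y, lam, n_iter=1000):
    """ISTA for min_x 0.5 * ||A x - y||^2 + lam * ||x||_1."""
    L = np.linalg.norm(A, 2) ** 2                    # Lipschitz constant of the gradient
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        x = x - A.T @ (A @ x - y) / L                # gradient step on the data term
        x = np.sign(x) * np.maximum(np.abs(x) - lam / L, 0.0)  # soft threshold
    return x

rng = np.random.default_rng(1)
A = rng.standard_normal((40, 100)) / np.sqrt(40)     # toy sensing matrix
x_true = np.zeros(100)
x_true[[5, 37, 80]] = [1.0, -2.0, 1.5]               # sparse ground truth
y = A @ x_true                                       # noiseless measurements
x_hat = ista(A, y, lam=0.01)
```

Solving for multiple frames jointly, as the paper proposes, amounts to stacking such systems with shared sparsity structure.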
Image and Video Upscaling from Local Self-Examples
Abstract
Cited by 39 (0 self)
We propose a new high-quality and efficient single-image upscaling technique that extends existing example-based super-resolution frameworks. In our approach we do not rely on an external example database or use the whole input image as a source for example patches. Instead, we follow a local self-similarity assumption on natural images and extract patches from extremely localized regions in the input image. This allows us to considerably reduce the nearest-patch search time without compromising quality in most images. Tests that we perform and report show that the local self-similarity assumption holds better for small scaling factors, where there are more example patches of greater relevance. We implement these small scalings using dedicated novel non-dyadic filter banks that we derive based on principles that model the upscaling process. Moreover, the new filters are nearly biorthogonal and hence produce high-resolution images that are highly consistent with the input image without solving implicit back-projection equations. The local and explicit nature of our algorithm makes it simple and efficient, and allows a trivial parallel implementation on a GPU. We demonstrate the new method's ability to produce high-quality resolution enhancement, its applicability to video sequences with no algorithmic modification, and its efficiency in performing real-time enhancement of low-resolution video standards into recent high-definition formats.
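The core search can be stated in a few lines: for each patch, look for its nearest neighbour only inside a small window around its own location. A brute-force sketch of that localized search (the paper adds the filter-bank machinery on top; the names below are illustrative):

```python
import numpy as np

def best_local_match(image, patch, center, radius):
    """Brute-force nearest-patch search restricted to a window around center."""
    ph, pw = patch.shape
    ci, cj = center
    best, best_pos = np.inf, None
    for i in range(max(0, ci - radius), min(image.shape[0] - ph, ci + radius) + 1):
        for j in range(max(0, cj - radius), min(image.shape[1] - pw, cj + radius) + 1):
            d = np.sum((image[i:i + ph, j:j + pw] - patch) ** 2)
            if d < best:
                best, best_pos = d, (i, j)
    return best_pos

rng = np.random.default_rng(2)
img = rng.random((32, 32))
patch = img[10:15, 12:17].copy()   # a patch cut from the image itself
```

Restricting the window keeps the cost proportional to radius squared per patch, instead of scanning the whole image or an external database.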
Least-Squares Image Resizing Using Finite Differences, 2001
Abstract
Cited by 24 (2 self)
We present an optimal spline-based algorithm for the enlargement or reduction of digital images with arbitrary (non-integer) scaling factors. This projection-based approach can be realized thanks to a new finite difference method that allows the computation of inner products with analysis functions that are B-splines of any degree. A noteworthy property of the algorithm is that the computational complexity per pixel does not depend on the scaling factor. For a given choice of basis functions, the results of our method are consistently better than those of the standard interpolation procedure; the present scheme achieves a reduction of artifacts such as aliasing and blocking and a significant improvement of the signal-to-noise ratio. The method can be generalized to include other classes of piecewise polynomial functions, expressed as linear combinations of B-splines and their derivatives.
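For degree-0 B-splines, the projection reduces to block averaging, which makes the contrast with plain decimation easy to see. An illustrative integer-factor analogue, not the paper's arbitrary-factor algorithm:

```python
import numpy as np

def reduce_ls(signal, factor):
    """Least-squares reduction onto a piecewise-constant basis: block means."""
    return signal.reshape(-1, factor).mean(axis=1)

def reduce_decimate(signal, factor):
    """Plain decimation: keep every factor-th sample (prone to aliasing)."""
    return signal[::factor]

ramp = np.arange(8.0)
avg = reduce_ls(ramp, 2)        # block means: [0.5, 2.5, 4.5, 6.5]
dec = reduce_decimate(ramp, 2)  # kept samples: [0.0, 2.0, 4.0, 6.0]
```

The averaging variant incorporates the anti-aliasing prefilter implicitly; higher-degree B-spline analysis functions refine this further.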
Four-dimensional cardiac imaging in living embryos via post-acquisition synchronization of non-gated slice sequences
JOURNAL OF BIOMEDICAL OPTICS 10(5), 2005