Results 1–10 of 24
Automatic image retargeting
 In Mobile and Ubiquitous Multimedia (MUM), ACM
, 2005
Abstract

Cited by 62 (3 self)
Figure 1: Preserving functional realism rather than photorealism by image retargeting. (a) The source image containing three areas of higher importance: the two boys and the ball. (b) The source image retargeted to fit a PDA display. (c) The source image retargeted to fit a cell phone display. In the retargeted images, our algorithm is able to keep both boys in the image and maintain the relative positions of all shadows.
Solving Inverse Problems with Piecewise Linear Estimators: From Gaussian Mixture Models to Structured Sparsity
, 2010
Abstract

Cited by 55 (8 self)
A general framework for solving image inverse problems is introduced in this paper. The approach is based on Gaussian mixture models, estimated via a computationally efficient MAP-EM algorithm. A dual mathematical interpretation of the proposed framework with structured sparse estimation is described, which shows that the resulting piecewise linear estimate stabilizes the estimation when compared to traditional sparse inverse problem techniques. This interpretation also suggests an effective dictionary-motivated initialization for the MAP-EM algorithm. We demonstrate that in a number of image inverse problems, including inpainting, zooming, and deblurring, the same algorithm produces results that are equal to, often significantly better than, or within a very small margin of the best published ones, at a lower computational cost.
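The piecewise linear estimator sketched in this abstract amounts to a linear (Wiener-type) MAP filter per Gaussian component, plus a model-selection step. Below is a minimal numerical sketch of that idea; the toy covariances, the masking operator standing in for inpainting, and the noise level are our assumptions, not the authors' setup:

```python
import numpy as np

rng = np.random.default_rng(0)

# Two zero-mean Gaussian models with different covariances (toy assumption).
d = 8
def rand_cov(scale):
    M = rng.standard_normal((d, d))
    return scale * (M @ M.T) / d + 0.1 * np.eye(d)
covs = [rand_cov(1.0), rand_cov(5.0)]

# Degradation: y = A x + noise; here A masks half the entries (inpainting-like).
A = np.eye(d)[: d // 2]
sigma2 = 0.01

def map_estimate(y, S):
    # Linear MAP (Wiener) estimate for prior x ~ N(0, S), y = A x + N(0, sigma2 I).
    G = S @ A.T @ np.linalg.inv(A @ S @ A.T + sigma2 * np.eye(len(y)))
    return G @ y

def neg_log_evidence(y, S):
    # -log p(y | model), up to constants: y ~ N(0, A S A^T + sigma2 I).
    C = A @ S @ A.T + sigma2 * np.eye(len(y))
    _, logdet = np.linalg.slogdet(C)
    return 0.5 * (y @ np.linalg.solve(C, y) + logdet)

# Draw x from model 0, observe it, select the best-fitting model, estimate x.
x = rng.multivariate_normal(np.zeros(d), covs[0])
y = A @ x + np.sqrt(sigma2) * rng.standard_normal(d // 2)
k = int(np.argmin([neg_log_evidence(y, S) for S in covs]))
x_hat = map_estimate(y, covs[k])
```

Model selection here uses the marginal likelihood of the observation under each Gaussian; the paper's MAP-EM algorithm additionally re-estimates the mixture parameters, which this sketch omits.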
A minimum squared-error framework for generalized sampling
 IEEE Trans. Signal Process.
, 2006
Abstract

Cited by 38 (23 self)
We treat the problem of reconstructing a signal from its nonideal samples, where the sampling and reconstruction spaces as well as the class of input signals can be arbitrary subspaces of a Hilbert space. Our formulation is general, and includes as special cases reconstruction from finitely many samples as well as uniform sampling of continuous-time signals, which are not necessarily bandlimited. To obtain a good approximation of the signal in the reconstruction space from its samples, we suggest two design strategies that attempt to minimize the squared-norm error between the signal and its reconstruction. The approaches we propose differ in their assumptions on the input signal: if the signal is known to lie in an appropriately chosen subspace, then we propose a method that achieves the minimal squared error. On the other hand, when the signal is not restricted, we show that the minimal-norm reconstruction cannot generally be obtained. Instead, we suggest minimizing the worst-case squared error between the reconstructed signal and the best possible (but usually unattainable) approximation of the signal within the reconstruction space. We demonstrate both theoretically and through simulations that the suggested methods can outperform the consistent reconstruction approach previously proposed for this problem.
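In a finite-dimensional toy setting, the consistent reconstruction this abstract compares against, and a minimax-regret-style alternative, can both be written in a few lines. The subspace dimensions and the reading of the minimax-regret estimate as "project the least-squares sample fit onto the reconstruction space" are our assumptions for illustration, not code from the paper:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 6
S = rng.standard_normal((n, 3))   # basis of the sampling space (columns)
W = rng.standard_normal((n, 3))   # basis of the reconstruction space (columns)

x = rng.standard_normal(n)        # unknown signal
c = S.T @ x                       # its nonideal samples <s_i, x>

# Consistent reconstruction: the unique element of span(W) whose samples
# agree with those of x (an oblique projection).
x_cons = W @ np.linalg.solve(S.T @ W, c)

# Minimax-regret-style alternative, as we read the abstract: first take the
# least-squares estimate in span(S) from the samples alone, then project it
# orthogonally onto the reconstruction space span(W).
x_ls = S @ np.linalg.solve(S.T @ S, c)
P_W = W @ np.linalg.solve(W.T @ W, W.T)
x_regret = P_W @ x_ls
```

The consistent estimate reproduces any signal that already lies in the reconstruction space and always matches the measured samples; the projected least-squares estimate trades that consistency for worst-case error control, which is the comparison the abstract describes.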
Variational image reconstruction from arbitrarily spaced samples: A fast multiresolution spline solution
 IEEE Trans. Image Process.
, 2005
Abstract

Cited by 33 (3 self)
We propose a novel method for image reconstruction from nonuniform samples with no constraints on their locations. We adopt a variational approach where the reconstruction is formulated as the minimizer of a cost that is a weighted sum of two terms: 1) the sum of squared errors at the specified points and 2) a quadratic functional that penalizes the lack of smoothness. We search for a solution that is a uniform spline and show how it can be determined by solving a large, sparse system of linear equations. We interpret the solution of our approach as an approximation of the analytical solution that involves radial basis functions and demonstrate the computational advantages of our approach. Using the two-scale relation for B-splines, we derive an algebraic relation that links together the linear systems of equations specifying reconstructions at different levels of resolution. We use this relation to develop a fast multigrid algorithm. We demonstrate the effectiveness of our approach on some image reconstruction examples.
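A heavily simplified sketch of the variational recipe in this abstract, using linear B-splines (hats) on a uniform knot grid in 1-D and a discrete second-difference penalty in place of the paper's quadratic smoothness functional (the multiresolution/multigrid machinery is omitted):

```python
import numpy as np

def fit_spline(xs, ys, n_knots, lam=1e-3):
    """Fit coefficients c of a uniform linear B-spline (hat) expansion on
    knots 0..n_knots-1 by minimizing
        sum_i (f(x_i) - y_i)^2 + lam * ||second differences of c||^2,
    a simplified discrete stand-in for the paper's variational criterion."""
    j = np.arange(n_knots)
    # Design matrix: B[i, j] = hat(x_i - j), hat = triangle on [-1, 1].
    B = np.maximum(0.0, 1.0 - np.abs(xs[:, None] - j[None, :]))
    # Second-difference roughness penalty matrix.
    D = np.diff(np.eye(n_knots), n=2, axis=0)
    # Normal equations of the quadratic cost: a small sparse-like linear system.
    return np.linalg.solve(B.T @ B + lam * (D.T @ D), B.T @ ys)

def eval_spline(c, x):
    j = np.arange(len(c))
    return np.maximum(0.0, 1.0 - np.abs(np.atleast_1d(x)[:, None] - j[None, :])) @ c

# Scattered samples of a linear signal at arbitrary (non-knot) locations.
rng = np.random.default_rng(2)
xs = rng.uniform(0, 9, size=25)
ys = 2.0 * xs + 1.0
c = fit_spline(xs, ys, n_knots=10)
```

Linear signals lie both in the hat-spline space and in the null space of the penalty, so they are reproduced exactly for any λ, which makes a convenient sanity check.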
Linear interpolation revitalized
 IEEE Trans. Image Process.
, 2004
Abstract

Cited by 25 (2 self)
Abstract—We present a simple, original method to improve piecewise-linear interpolation with uniform knots: we shift the sampling knots by a fixed amount, while enforcing the interpolation property. We determine the theoretical optimal shift that maximizes the quality of our shifted linear interpolation. Surprisingly enough, this optimal value is nonzero and close to 1/5. We confirm our theoretical findings by performing several experiments: a cumulative rotation experiment and a zoom experiment. Both show a significant increase in the quality of the shifted method with respect to the standard one. We also observe that, in these results, we get a quality that is similar to that of the computationally more costly “high-quality” cubic convolution. Index Terms—Approximation methods, error analysis, interpolation, piecewise linear approximation, recursive digital filters, spline functions.
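The shifted-linear scheme described above is easy to reproduce in a few lines: a one-tap causal recursive prefilter enforces the interpolation condition, and evaluation uses triangle basis functions shifted by τ. This is our reconstruction from the abstract (the boundary handling is an assumption), not the authors' code:

```python
import numpy as np

TAU = 0.21  # near-optimal shift reported in the paper (close to 1/5)

def prefilter(s, tau=TAU):
    """Compute coefficients c so that the shifted-linear interpolant passes
    through the samples: interpolation condition (1-tau)*c[n] + tau*c[n-1] = s[n].
    This is a simple one-tap causal recursive filter."""
    c = np.empty(len(s))
    c[0] = s[0]  # boundary choice (assumption)
    for n in range(1, len(s)):
        c[n] = (s[n] - tau * c[n - 1]) / (1.0 - tau)
    return c

def interpolate(c, x, tau=TAU):
    """Evaluate f(x) = sum_k c[k] * hat(x - k - tau), hat = triangle on [-1, 1]."""
    t = np.asarray(x, dtype=float) - tau
    k = np.clip(np.floor(t).astype(int), 0, len(c) - 2)
    frac = t - k
    return (1.0 - frac) * c[k] + frac * c[k + 1]

s = np.array([1.0, 2.0, 0.5, 3.0, 2.5])
c = prefilter(s)
```

Evaluating the interpolant at the integer knots n ≥ 1 returns the original samples s[n]; only the shifted basis and the recursive prefilter distinguish the method from standard linear interpolation.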
Super-resolution with sparse mixing estimators
 IEEE Trans. Image Process.
, 2010
Abstract

Cited by 21 (1 self)
Abstract—We introduce a class of inverse problem estimators computed by adaptively mixing a family of linear estimators corresponding to different priors. Sparse mixing weights are calculated over blocks of coefficients in a frame providing a sparse signal representation. They minimize an ℓ¹ norm taking into account the signal regularity in each block. Adaptive directional image interpolations are computed over a wavelet frame with an algorithm, providing state-of-the-art numerical results. Index Terms—Block matching pursuit, interpolation, inverse problem, mixing estimator, structured sparsity, super-resolution, Tikhonov regularization, wavelet.
A survey of image retargeting techniques
, 2010
Abstract

Cited by 13 (0 self)
Advances in imaging technology have made the capture and display of digital images ubiquitous. A variety of displays are used to view them, ranging from high-resolution computer monitors to low-resolution mobile devices, and images often have to undergo changes in size and aspect ratio to adapt to different screens. Also, displaying and printing documents with embedded images frequently entail resizing of the images to comply with the overall layout. Straightforward image resizing operators, such as scaling, often do not produce satisfactory results, since they are oblivious to image content. In this work, we review and categorize algorithms for content-aware image retargeting, i.e., resizing an image while taking its content into consideration to preserve important regions and minimize distortions. This is a challenging problem, as it requires preserving the relevant information while maintaining an aesthetically pleasing image for the user. The techniques typically start by computing an importance map which represents the relevance of every pixel, and then apply an operator that resizes the image while taking into account the importance map and additional constraints. We intend this review to be useful to researchers and practitioners interested in image retargeting.
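The two-stage pipeline the survey describes (an importance map, then a resizing operator) can be illustrated with the simplest possible instance: a gradient-energy importance map followed by a best-crop operator. This is purely illustrative and not any specific algorithm from the survey:

```python
import numpy as np

def importance_map(img):
    # Gradient-magnitude energy as a crude per-pixel importance measure.
    gy, gx = np.gradient(img.astype(float))
    return np.abs(gx) + np.abs(gy)

def best_crop(img, h, w):
    """Retarget by cropping: slide an h x w window over the importance map
    and keep the window with the largest total importance (content-aware
    cropping, the simplest retargeting operator)."""
    e = importance_map(img)
    # Integral image for O(1) window sums.
    ii = np.pad(e.cumsum(0).cumsum(1), ((1, 0), (1, 0)))
    H, W = img.shape
    best, best_rc = -1.0, (0, 0)
    for r in range(H - h + 1):
        for col in range(W - w + 1):
            s = ii[r + h, col + w] - ii[r, col + w] - ii[r + h, col] + ii[r, col]
            if s > best:
                best, best_rc = s, (r, col)
    r, col = best_rc
    return img[r:r + h, col:col + w]

# A flat image with one textured patch: the chosen crop lands on the patch.
img = np.zeros((12, 12))
img[7:10, 8:11] = np.arange(9).reshape(3, 3)
crop = best_crop(img, 4, 4)
```

Real retargeting operators (seam carving, warping) deform rather than crop, but they consume the same kind of importance map.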
A minimum squared-error framework for sampling and reconstruction in arbitrary spaces
 IEEE Trans. Signal Process.
, 2006
Abstract

Cited by 5 (5 self)
We consider nonideal sampling and reconstruction schemes in which the sampling and reconstruction spaces, as well as the input signal, can be arbitrary. To obtain a good reconstruction of the signal in the reconstruction space from arbitrary samples, we suggest processing the samples prior to reconstruction with a linear transformation that is designed to minimize the worst-case squared-norm error between the reconstructed signal and the best possible (but usually unattainable) approximation of the signal in the reconstruction space. If the input signal is known to lie in an appropriately chosen subspace, then we propose a linear transformation that achieves the minimal squared-error approximation. We show both theoretically and through simulations that if the input signal does not lie in the reconstruction space, then the suggested methods can outperform the consistent reconstruction method previously proposed for this problem.
Regularized image upsampling
, 2004
Abstract

Cited by 1 (0 self)
This thesis addresses the problem of performing image magnification to achieve higher perceived resolution for greyscale and color images. A new perspective on the problem is introduced through the new concept of a theoretical camera that can acquire an ideal high-resolution image. A new formulation of the problem is then introduced using two ingredients: a newly designed observation model and the total-variation regularizer. An observation model, which establishes a generalized relation between the desired magnified image and the measured lower-resolution image, has been designed based on careful study of the physical acquisition processes that generated the images. The result is a major contribution of this thesis: a closed-form solution for obtaining the observation model. This closed form has been implemented, observation models were obtained for different typical scenarios, and they were shown to outperform observation models used in the literature. Two new theorems have been developed for designing the theoretical camera, adapted to the display device used, on arbitrary lattices. The thesis presents new analysis with a signal […]
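The two ingredients named in this abstract (a linear observation model and a total-variation regularizer) combine into a generic regularized upsampling iteration. The block-averaging observation model, the smoothed TV gradient, and plain gradient descent below are stand-ins we chose for illustration; the thesis derives a much more careful closed-form observation model:

```python
import numpy as np

def downsample(u, f=2):
    # Toy observation model: block averaging (a stand-in for the thesis's
    # physically based model).
    H, W = u.shape
    return u.reshape(H // f, f, W // f, f).mean(axis=(1, 3))

def upsample_adjoint(y, f=2):
    # Adjoint of block averaging: replicate each value and rescale.
    return np.kron(y, np.ones((f, f))) / (f * f)

def tv_grad(u, eps=1e-3):
    # Gradient of the smoothed total variation sum(sqrt(ux^2 + uy^2 + eps^2)),
    # with periodic handling at the boundary (a simplification).
    ux = np.diff(u, axis=1, append=u[:, -1:])
    uy = np.diff(u, axis=0, append=u[-1:, :])
    mag = np.sqrt(ux**2 + uy**2 + eps**2)
    px, py = ux / mag, uy / mag
    div = (px - np.roll(px, 1, axis=1)) + (py - np.roll(py, 1, axis=0))
    return -div

def tv_upsample(y, f=2, lam=0.02, step=0.5, iters=200):
    # Minimize ||downsample(u) - y||^2 + lam * TV(u) by plain gradient descent.
    u = np.kron(y, np.ones((f, f)))  # initial guess: nearest-neighbor upsampling
    for _ in range(iters):
        r = downsample(u, f) - y
        u -= step * (2 * upsample_adjoint(r, f) + lam * tv_grad(u))
    return u

y = np.full((3, 3), 2.0)  # a constant low-res image
u = tv_upsample(y)        # constant images are a fixed point of the iteration
```

With a constant input the TV gradient vanishes and nearest-neighbor upsampling already satisfies the observation model exactly, which makes a convenient sanity check.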
Perceptual image preview
Abstract
Abstract Image preview is a convenient way to browse large or multiple images on small displays. However, current signal-level image resampling algorithms may remove many features of interest in the preview image. In this paper, we propose perceptual image preview, which retains more perceptual features so that users can inspect features of interest by viewing the preview image only, without zooming in. This technology has two components: structure enhancement and perceptual feature visualization. Structure enhancement enhances the image structure while suppressing subtle details using a gradient modulation method, thus making the subsequent perceptual features more apparent. For perceptual feature visualization, features of interest detected in the picture are visualized on the structure-enhanced preview image. We demonstrate with two of the most commonly used image quality features, image blur and noise. The effectiveness of the proposed method is validated by experimental results.