Results 1–10 of 188
A Singular Value Thresholding Algorithm for Matrix Completion
2008
"... This paper introduces a novel algorithm to approximate the matrix with minimum nuclear norm among all matrices obeying a set of convex constraints. This problem may be understood as the convex relaxation of a rank minimization problem, and arises in many important applications as in the task of reco ..."
Cited by 555 (22 self)
…-to-implement algorithm that is extremely efficient at addressing problems in which the optimal solution has low rank. The algorithm is iterative and produces a sequence of matrices {X^k, Y^k}; at each step it mainly performs a soft-thresholding operation on the singular values of the matrix Y^k. There are two …
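The step this snippet describes (soft-thresholding the singular values of Y^k) can be sketched in isolation; the function name `svt_shrink` and the threshold `tau` are illustrative, and this is a minimal sketch of the shrinkage operator, not the paper's full iteration.

```python
import numpy as np

def svt_shrink(Y, tau):
    """Soft-threshold the singular values of Y: each singular value is
    reduced by tau and clipped at zero, so small ones vanish and the
    result tends to have low rank."""
    U, s, Vt = np.linalg.svd(Y, full_matrices=False)
    return U @ np.diag(np.maximum(s - tau, 0.0)) @ Vt
```

Applying this operator inside an iteration of the kind described is what drives the iterates toward a low-rank solution.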
Distributed soft thresholding for sparse signal recovery
 CoRR
"... Abstract—In this paper, we address the problem of distributed sparse recovery of signals acquired via compressed measurements in a sensor network. We propose a new class of distributed algorithms to solve Lasso regression problems, when the communication to a fusion center is not possible, e.g., du ..."
Cited by 3 (0 self)
…e.g., due to communication cost or privacy reasons. More precisely, we introduce a distributed iterative soft-thresholding algorithm (DISTA) that consists of three steps: an averaging step, a gradient step, and a soft-thresholding operation. We prove the convergence of DISTA in networks represented …
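The three steps quoted here (averaging, gradient, soft thresholding) can be sketched as one round of a generic distributed iterative-soft-thresholding scheme; the mixing matrix `W`, step size `mu`, and weight `lam` are illustrative assumptions in the spirit of the description, not the exact DISTA update rule.

```python
import numpy as np

def soft(x, t):
    """Componentwise soft-thresholding operator."""
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def distributed_ist_round(X, A_list, y_list, W, mu, lam):
    """One round for all nodes: average neighbors' estimates with the
    mixing matrix W, take a gradient step on each node's local
    least-squares term, then apply the soft-thresholding operator.
    X has one row per node."""
    X_avg = W @ X                                        # averaging step
    X_new = np.empty_like(X)
    for i, (A, y) in enumerate(zip(A_list, y_list)):
        grad = A.T @ (A @ X_avg[i] - y)                  # gradient step
        X_new[i] = soft(X_avg[i] - mu * grad, mu * lam)  # soft thresholding
    return X_new
```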
Iterative reconstruction algorithm for nonlinear operators
"... Iterative soft thresholding of a model's wavelet coefficients can be used to obtain models that are sparse with respect to a known basis function. We generate sparse models for nonlinear forward operators by applying the soft-thresholding operator to the model obtained through a Gauss-Newton iteratio ..."
Iterative soft thresholding of a model's wavelet coefficients can be used to obtain models that are sparse with respect to a known basis function. We generate sparse models for nonlinear forward operators by applying the soft-thresholding operator to the model obtained through a Gauss-Newton iteration …
Nonparametric Regression and Classification with Joint Sparsity Constraints
"... We propose new families of models and algorithms for high-dimensional nonparametric learning with joint sparsity constraints. Our approach is based on a regularization method that enforces common sparsity patterns across different function components in a nonparametric additive model. The algorithms ..."
Cited by 18 (3 self)
…The algorithms employ a coordinate descent approach that is based on a functional soft-thresholding operator. The framework yields several new models, including multi-task sparse additive models, multi-response sparse additive models, and sparse additive multi-category logistic regression. The methods …
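The functional soft-thresholding operator mentioned here shrinks an entire function component by its norm, zeroing components whose norm falls below the threshold. A minimal finite-dimensional analogue (the group soft-threshold, with illustrative names), assuming each component is represented by its values on a grid:

```python
import numpy as np

def group_soft(f, lam):
    """Group/functional soft-thresholding: scale the whole component f
    toward zero by lam in norm; if its norm is at most lam, the
    component is removed entirely, inducing component-wise sparsity."""
    norm = np.linalg.norm(f)
    if norm <= lam:
        return np.zeros_like(f)
    return (1.0 - lam / norm) * f
```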
Asymptotic analysis of MAP estimation via the replica method and applications to compressed sensing
2009
"... The replica method is a non-rigorous but widely accepted technique from statistical physics used in the asymptotic analysis of large, random, nonlinear problems. This paper applies the replica method to non-Gaussian maximum a posteriori (MAP) estimation. It is shown that with random linear measureme ..."
Cited by 77 (9 self)
…estimators used in compressed sensing, including basis pursuit, lasso, linear estimation with thresholding, and zero norm-regularized estimation. In the case of lasso estimation the scalar estimator reduces to a soft-thresholding operator, and for zero norm-regularized estimation it reduces to a hard-thresholding …
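The two scalar operators this snippet refers to are the standard soft and hard thresholding maps; a direct sketch (the threshold argument `t` is an illustrative name):

```python
import numpy as np

def soft_threshold(x, t):
    """Lasso-type shrinkage: pull x toward zero by t, clipping at zero."""
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def hard_threshold(x, t):
    """Zero-norm-type selection: keep x where |x| > t, zero elsewhere."""
    return np.where(np.abs(x) > t, x, 0.0)
```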
Image Feature Extraction by Sparse Coding and Independent Component Analysis
 In Proc. Int. Conf. on Pattern Recognition (ICPR'98)
1998
"... Sparse coding is a method for finding a representation of data in which each of the components of the representation is only rarely significantly active. Such a representation is closely related to the techniques of independent component analysis and blind source separation. In this paper, we invest ..."
Cited by 22 (3 self)
…investigate the application of sparse coding for image feature extraction. We show how sparse coding can be used to extract wavelet-like features from natural image data. As an application of such a feature extraction scheme, we show how to apply a soft-thresholding operator on the components of sparse coding …
Variable Selection for SVM via Smoothing Spline ANOVA
"... It is well-known that the support vector machine paradigm is equivalent to solving a regularization problem in a reproducing kernel Hilbert space. The squared norm penalty in the standard support vector machine controls the smoothness of the classification function. We propose, under the framewor ..."
Cited by 4 (0 self)
…the framework of smoothing spline ANOVA models, a new type of regularization to conduct simultaneous classification and variable selection in the SVM. The penalty functional used is the sum of functional component norms, which automatically applies soft-thresholding operations to functional components, hence …
VARIABLE SELECTION FOR SUPPORT VECTOR MACHINES VIA SMOOTHING SPLINE ANOVA
2006
"... Abstract: It is well-known that the support vector machine paradigm is equivalent to solving a regularization problem in a reproducing kernel Hilbert space. The squared norm penalty in the standard support vector machine controls the smoothness of the classification function. We propose, under the ..."
…the framework of smoothing spline ANOVA models, a new type of regularization to conduct simultaneous classification and variable selection in the SVM. The penalty functional used is the sum of functional component norms, which automatically applies soft-thresholding operations to functional components, hence …
A fast thresholded Landweber algorithm for wavelet-regularized multidimensional deconvolution
 IEEE Trans. Image Process
2008
"... Abstract—We present a fast variational deconvolution algorithm that minimizes a quadratic data term subject to a regularization on the ℓ1-norm of the wavelet coefficients of the solution. Previously available methods have essentially consisted in alternating between a Landweber iteration and a wavele ..."
Cited by 40 (7 self)
…wavelet-domain soft-thresholding operation. While having the advantage of simplicity, they are known to converge slowly. By expressing the cost functional in a Shannon wavelet basis, we are able to decompose the problem into a series of subband-dependent minimizations. In particular, this allows …
unknown title
"... A fast thresholded Landweber algorithm for wavelet-regularized multidimensional deconvolution. Cédric Vonesch, Student Member, IEEE, and Michael Unser, Fellow, IEEE. We present a fast variational deconvolution algorithm that minimizes a quadratic data term subject to a regularization on the ℓ1-norm of ..."
…of the wavelet coefficients of the solution. Previously available methods have essentially consisted in alternating between a Landweber iteration and a wavelet-domain soft-thresholding operation. While having the advantage of simplicity, they are known to converge slowly. By expressing the cost functional in a …