Results 1 – 10 of 177,416
Jointly Sparse Vectors, 2008
"... The rapidly developing area of compressed sensing suggests that a sparse vector lying in an arbitrarily high-dimensional space can be accurately recovered from only a small set of nonadaptive linear measurements. Under appropriate conditions on the measurement matrix, the entire information about the o ..."
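The recovery problem this snippet describes can be illustrated with a greedy baseline such as orthogonal matching pursuit (OMP), one standard reconstruction algorithm in compressed sensing; it is not necessarily the method of this particular paper, and the dimensions and Gaussian measurement matrix below are illustrative assumptions:

```python
import numpy as np

def omp(A, y, k):
    """Orthogonal matching pursuit: greedily pick the column of A most
    correlated with the residual, then re-fit the selected atoms by
    least squares."""
    m, n = A.shape
    residual = y.copy()
    support = []
    x = np.zeros(n)
    for _ in range(k):
        j = int(np.argmax(np.abs(A.T @ residual)))  # most correlated atom
        if j not in support:
            support.append(j)
        coef, *_ = np.linalg.lstsq(A[:, support], y, rcond=None)
        x = np.zeros(n)
        x[support] = coef
        residual = y - A @ x
    return x, support

# Illustrative example: a 3-sparse vector in R^60 observed through
# 25 random linear measurements.
rng = np.random.default_rng(0)
A = rng.standard_normal((25, 60)) / np.sqrt(25)
x_true = np.zeros(60)
x_true[[5, 17, 42]] = [3.0, -2.0, 1.5]
y = A @ x_true
x_hat, support = omp(A, y, k=3)
```

With enough well-conditioned measurements relative to the sparsity level, greedy selection of this kind typically identifies the true support; the conditions on the measurement matrix under which this is guaranteed are exactly what the compressed-sensing literature studies.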
Compile-time sparse vectors in C++, 1997
"... Templates are a powerful feature of C++. In this article a template library for a special class of sparse vectors is outlined. For these vectors, the sparseness structure of the vectors can be arbitrary but must be known at compile time. In this case it suffices to store only the nonzero elements of ..."
Sparse Vector Autoregressive Modeling, 2012
Cited by 1 (0 self)
"... The vector autoregressive (VAR) model has been widely used for modeling temporal dependence in a multivariate time series. For large (and even moderate) dimensions, the number of AR coefficients can be prohibitively large, resulting in noisy estimates, unstable predictions and difficult-to-interpret ..."
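To see why the coefficient count matters, note that a VAR(1) on a d-dimensional series already has d² transition coefficients. A toy numpy sketch of a naive sparse estimate (ordinary least squares followed by hard thresholding, which is only an illustration of the idea, not the estimator proposed in the paper):

```python
import numpy as np

def fit_var1_thresholded(X, tau):
    """Fit a VAR(1) model x_t = B x_{t-1} + e_t by least squares, then
    hard-threshold small coefficients to obtain a sparse estimate.
    X has shape (T, d): T observations of a d-dimensional series."""
    past, future = X[:-1], X[1:]
    B, *_ = np.linalg.lstsq(past, future, rcond=None)
    B = B.T                        # row i holds coefficients predicting series i
    B[np.abs(B) < tau] = 0.0       # naive sparsification
    return B

# Simulate a stable VAR(1) with a sparse 5x5 transition matrix.
rng = np.random.default_rng(1)
d, T = 5, 400
B_true = np.zeros((d, d))
B_true[0, 1] = 0.5
B_true[2, 4] = -0.4
X = np.zeros((T, d))
for t in range(1, T):
    X[t] = B_true @ X[t - 1] + 0.1 * rng.standard_normal(d)
B_hat = fit_var1_thresholded(X, tau=0.1)
```

Hard thresholding is crude; penalized estimators (e.g. lasso-type penalties on the AR coefficients) are the usual remedy for the noisy, unstable estimates the abstract mentions.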
Efficient recovery of jointly sparse vectors, in NIPS
Cited by 15 (0 self)
"... We consider the reconstruction of sparse signals in the multiple measurement vector (MMV) model, in which the signal, represented as a matrix, consists of a set of jointly sparse vectors. MMV is an extension of the single measurement vector (SMV) model employed in standard compressive sensing (CS). ..."
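Joint sparsity in the MMV model means every column of the signal matrix shares a single support. A common greedy baseline (again, not necessarily this paper's algorithm) is simultaneous OMP, which ranks atoms by their correlation energy summed over all measurement vectors; a numpy sketch under illustrative dimensions:

```python
import numpy as np

def somp(A, Y, k):
    """Simultaneous OMP for the MMV model Y = A X with row-sparse X:
    at each step pick the atom whose correlation energy, summed over
    all measurement vectors (columns of Y), is largest."""
    m, n = A.shape
    R = Y.copy()
    support = []
    for _ in range(k):
        scores = np.linalg.norm(A.T @ R, axis=1)   # energy per atom
        support.append(int(np.argmax(scores)))
        C, *_ = np.linalg.lstsq(A[:, support], Y, rcond=None)
        R = Y - A[:, support] @ C
    X = np.zeros((n, Y.shape[1]))
    X[support] = C
    return X, sorted(support)

# Four jointly 2-sparse vectors sharing the support {7, 31}.
rng = np.random.default_rng(2)
A = rng.standard_normal((20, 50)) / np.sqrt(20)
X_true = np.zeros((50, 4))
X_true[[7, 31]] = rng.standard_normal((2, 4))
Y = A @ X_true
X_hat, support = somp(A, Y, k=2)
```

Pooling correlations across measurement vectors is what distinguishes the MMV setting from running SMV recovery on each column separately: the shared support lets each snapshot reinforce the others.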
Sparse Bayesian Learning and the Relevance Vector Machine, 2001
Cited by 968 (5 self)
"... This paper introduces a general Bayesian framework for obtaining sparse solutions to regression and classification tasks utilising models linear in the parameters. Although this framework is fully general, we illustrate our approach with a particular specialisation that we denote the 'relevance vect ..."
Reduce and Boost: Recovering Arbitrary Sets of Jointly Sparse Vectors, 2008
Cited by 100 (41 self)
"... The rapidly developing area of compressed sensing suggests that a sparse vector lying in a high-dimensional space can be accurately and efficiently recovered from only a small set of nonadaptive linear measurements, under appropriate conditions on the measurement matrix. The vector model has been ext ..."
THE REMBO ALGORITHM: ACCELERATED RECOVERY OF JOINTLY SPARSE VECTORS
"... We address the problem of recovering a sparse solution of a linear underdetermined system. Two variants of this problem are studied in the literature. One is the case of a sparse vector with only a few nonzero entries, and the other is of a sparse matrix with few rows non-identically zero. In eit ..."
Optimally sparse representation in general (nonorthogonal) dictionaries via ℓ¹ minimization
Proc. Natl. Acad. Sci. USA 100, 2197–2202, 2002
Cited by 632 (38 self)
"... Given a 'dictionary' D = {d_k} of vectors d_k, we seek to represent a signal S as a linear combination S = Σ_k γ(k) d_k, with scalar coefficients γ(k). In particular, we aim for the sparsest representation possible. In general, this requires a combinatorial optimization process. Previous work considered ..."
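The combinatorial search for the sparsest γ is what the ℓ¹ relaxation avoids. One simple solver for the Lagrangian (lasso) form min_γ ½‖S − Dγ‖² + λ‖γ‖₁ is iterative soft-thresholding (ISTA); the solver choice, dictionary, and dimensions below are illustrative assumptions, not details from the paper:

```python
import numpy as np

def ista(D, s, lam, n_iter=500):
    """Iterative soft-thresholding for min_g 0.5*||s - D g||^2 + lam*||g||_1."""
    step = 1.0 / np.linalg.norm(D, 2) ** 2          # 1/L with L = ||D||_2^2
    g = np.zeros(D.shape[1])
    for _ in range(n_iter):
        z = g - step * (D.T @ (D @ g - s))          # gradient step on the quadratic
        g = np.sign(z) * np.maximum(np.abs(z) - step * lam, 0.0)  # soft shrink
    return g

# Overcomplete dictionary: 80 atoms in R^30, 3-sparse true coefficients.
rng = np.random.default_rng(3)
D = rng.standard_normal((30, 80)) / np.sqrt(30)
g_true = np.zeros(80)
g_true[[4, 20, 55]] = [2.0, -1.5, 1.0]
s = D @ g_true
g_hat = ista(D, s, lam=0.01)
```

With the step size at most 1/L, each ISTA iteration provably does not increase the lasso objective, so the sketch converges toward an ℓ¹-sparse representation of s in D.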
Linear spatial pyramid matching using sparse coding for image classification, in IEEE Conference on Computer Vision and Pattern Recognition (CVPR), 2009
Cited by 496 (20 self)
"... Recently, SVMs using the spatial pyramid matching (SPM) kernel have been highly successful in image classification. Despite their popularity, these nonlinear SVMs have a complexity of O(n^2) to O(n^3) in training and O(n) in testing, where n is the training size, implying that it is nontrivial to scale up the algorithms to handle more than thousands of training images. In this paper we develop an extension of the SPM method, by generalizing vector quantization to sparse coding followed by multi-scale spatial max pooling, and propose a linear SPM kernel based on SIFT sparse codes. This new approach remarkably ..."
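The pooling step the abstract mentions can be made concrete: after local descriptors are encoded as sparse codes, each spatial cell is summarized by a coordinate-wise maximum over the absolute code values, and the per-cell summaries are concatenated into one image feature. A toy numpy sketch (the grid size, code dimension, and function name are illustrative, not from the paper):

```python
import numpy as np

def spatial_max_pool(codes, positions, n_cells=2):
    """Max-pool sparse codes over an n_cells x n_cells spatial grid.
    codes: (N, K) sparse codes for N descriptors.
    positions: (N, 2) descriptor locations normalized to [0, 1).
    Returns the concatenated per-cell max of |code|, shape (n_cells**2 * K,)."""
    N, K = codes.shape
    pooled = np.zeros((n_cells * n_cells, K))
    cell = (positions * n_cells).astype(int)        # grid cell of each descriptor
    idx = cell[:, 0] * n_cells + cell[:, 1]
    for i in range(N):
        pooled[idx[i]] = np.maximum(pooled[idx[i]], np.abs(codes[i]))
    return pooled.ravel()

# 100 descriptors with 16-dimensional sparse codes (about 10% nonzeros).
rng = np.random.default_rng(4)
codes = rng.standard_normal((100, 16)) * (rng.random((100, 16)) < 0.1)
positions = rng.random((100, 2))
feature = spatial_max_pool(codes, positions)
```

Max pooling of sparse codes is what lets a cheap linear SVM replace the nonlinear SPM kernel: the pooled feature is already discriminative, so the O(n^2) to O(n^3) kernel training cost disappears.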