Results 11 – 20 of 2,267
Separable Dictionary Learning
Cited by 9 (5 self)
Abstract
"... Many techniques in computer vision, machine learning, and statistics rely on the fact that a signal of interest admits a sparse representation over some dictionary. Dictionaries are either available analytically, or can be learned from a suitable training set. While analytic dictionaries permit ..."
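The entries on this page all build on the same core formulation: given signals X, find a dictionary D and sparse codes A with X ≈ DA. As an illustrative sketch only (not the algorithm of any listed paper), a minimal alternating scheme in NumPy, using ISTA for the sparse-coding step and a least-squares (MOD-style) dictionary update:

```python
import numpy as np

def sparse_code(X, D, lam, n_iter=100):
    """ISTA for min_A 0.5*||X - D A||_F^2 + lam*||A||_1."""
    A = np.zeros((D.shape[1], X.shape[1]))
    L = np.linalg.norm(D, 2) ** 2  # squared spectral norm = Lipschitz constant of the gradient
    for _ in range(n_iter):
        A -= D.T @ (D @ A - X) / L                              # gradient step on the fit term
        A = np.sign(A) * np.maximum(np.abs(A) - lam / L, 0.0)   # soft thresholding
    return A

def learn_dictionary(X, n_atoms, lam=0.1, n_outer=15, seed=0):
    """Block coordinate descent: alternate sparse coding and a
    least-squares dictionary update, renormalizing the atoms."""
    rng = np.random.default_rng(seed)
    D = rng.standard_normal((X.shape[0], n_atoms))
    D /= np.linalg.norm(D, axis=0)
    for _ in range(n_outer):
        A = sparse_code(X, D, lam)      # fix D, update the codes
        D = X @ np.linalg.pinv(A)       # fix A, update the dictionary
        norms = np.linalg.norm(D, axis=0)
        D[:, norms > 0] /= norms[norms > 0]
    return D, sparse_code(X, D, lam)
```

The two sub-problems are each convex, so alternating between them is the basic block coordinate descent pattern that several of the listed papers refine.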
DICTIONARY LEARNING OF CONVOLVED SIGNALS
Cited by 2 (0 self)
Abstract
"... Assuming that a set of source signals is sparsely representable in a given dictionary, we show how their sparse recovery fails whenever we can only measure a convolved observation of them. Starting from this motivation, we develop a block coordinate descent method which aims to learn a convolved dictionary ..."
On the Integration of Topic Modeling and Dictionary Learning
Cited by 8 (3 self)
Abstract
"... A new nonparametric Bayesian model is developed to integrate dictionary learning and topic modeling into a unified framework. The model is employed to analyze partially annotated images, with the dictionary learning performed directly on image patches. Efficient inference is performed with a Gibbs-slice ..."
A dictionary learning process . . .
Abstract
"... Empirically, we find that, despite the class-specific features owned by the objects appearing in the images, the objects from different categories usually share some common patterns, which do not contribute to their discrimination. Concentrating on this observation and under the general dictionary learning (DL) framework, we propose a novel method to explicitly learn a common pattern pool (the commonality) and class-specific dictionaries (the particularity) for classification. We call our method DL-COPAR, which can learn the most compact and most discriminative class-specific dictionaries ..."
Submodular Dictionary Learning for Sparse Coding
Cited by 12 (0 self)
Abstract
"... A greedy-based approach to learn a compact and discriminative dictionary for sparse representation is presented. We propose an objective function consisting of two components: entropy rate of a random walk on a graph and a discriminative term. Dictionary learning is achieved by finding a graph topology ..."
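The entry above builds its dictionary greedily. As a generic illustration of the underlying pattern (a hypothetical stand-in, not the paper's entropy-rate-plus-discriminative objective), greedy maximization of a monotone submodular set function repeatedly adds the element with the largest marginal gain:

```python
def greedy_max(ground_set, f, k):
    """Greedy maximization of a monotone set function f over subsets of
    size at most k. For monotone submodular f this simple loop carries
    the classic (1 - 1/e) approximation guarantee."""
    selected = []
    for _ in range(k):
        remaining = [x for x in ground_set if x not in selected]
        if not remaining:
            break
        # pick the element whose addition increases f the most
        best = max(remaining, key=lambda x: f(selected + [x]) - f(selected))
        selected.append(best)
    return selected
```

For example, with a set-coverage objective, the greedy loop first takes the largest set, then whichever set covers the most new elements.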
The sample complexity of dictionary learning, 2011
Cited by 23 (1 self)
Abstract
"... A large set of signals can sometimes be described sparsely using a dictionary, that is, every element can be represented as a linear combination of few elements from the dictionary. Algorithms for various signal processing applications, including classification, denoising and signal separation, learn a dictionary from a given set of signals to be represented. Can we expect that the error in representing by such a dictionary a previously unseen signal from the same source will be of similar magnitude as those for the given examples? We assume signals are generated from a fixed distribution ..."
Proximal methods for sparse hierarchical dictionary learning, In: ICML
Cited by 124 (21 self)
Abstract
"... This paper proposes to combine two approaches for modeling data admitting sparse representations: On the one hand, dictionary learning has proven very effective for various signal restoration and representation tasks. On the other hand, recent work on structured sparsity provides a natural framework ..."
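Proximal methods of the kind referred to above hinge on cheap proximal operators. A sketch of the two building blocks, elementwise and block soft thresholding, plus the composition pattern for tree-structured groups (the ordering claim, children before parents, is the known result for hierarchical norms that this line of work exploits; the group list here is an illustrative assumption, not the paper's API):

```python
import numpy as np

def prox_l1(v, t):
    """Prox of t*||.||_1: elementwise soft thresholding."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def prox_group_l2(v, t):
    """Prox of t*||.||_2: block soft thresholding (shrink or kill the whole group)."""
    n = np.linalg.norm(v)
    return np.zeros_like(v) if n <= t else (1.0 - t / n) * v

def prox_tree(v, groups, t):
    """Prox of a weighted sum of l2 norms over tree-structured groups.
    When the groups are processed children-before-parents, composing the
    individual group proxes gives the prox of the whole sum.
    `groups` is a list of (index_array, weight) pairs, leaves first."""
    u = np.asarray(v, dtype=float).copy()
    for idx, w in groups:
        u[idx] = prox_group_l2(u[idx], t * w)
    return u
```

Each operator runs in time linear in the group sizes, which is what makes proximal gradient iterations on these structured penalties practical.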
An Incidence Geometry approach to Dictionary Learning
Abstract
"... We study the Dictionary Learning (aka Sparse Coding) problem of obtaining a sparse representation of data points, by learning dictionary vectors upon which the data points can be written as sparse linear combinations. We view this problem from a geometry perspective as the spanning set of a subspace ..."
TASK-DRIVEN DICTIONARY LEARNING FOR INPAINTING
Abstract
"... Several approaches used for inpainting of images take advantage of sparse representations. Some of these seek to learn a dictionary that will adapt the sparse representation to the available data. A further refinement is to adapt the learning process to the task itself. In this paper, we formulate ..."
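A common way to connect sparse representations to inpainting, shown here as a generic sketch rather than the task-driven formulation the entry proposes: sparse-code only the observed entries of a signal over the corresponding rows of a given dictionary, then reconstruct the full signal from the code.

```python
import numpy as np

def inpaint(x, mask, D, lam=0.01, n_iter=500):
    """Estimate missing entries of x. Sparse-code the observed entries
    (mask == True) over the matching rows of dictionary D with ISTA,
    then synthesize the full signal as D @ code."""
    Dm, xm = D[mask], x[mask]            # restrict to observed coordinates
    a = np.zeros(D.shape[1])
    L = np.linalg.norm(Dm, 2) ** 2       # Lipschitz constant of the gradient
    for _ in range(n_iter):
        a -= Dm.T @ (Dm @ a - xm) / L
        a = np.sign(a) * np.maximum(np.abs(a) - lam / L, 0.0)
    return D @ a
```

If the signal truly is sparse over D and enough entries are observed, the code estimated from the visible coordinates reproduces the hidden ones as well; the listed paper's refinement is to learn D with this downstream task in mind.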
Probabilistic models for supervised dictionary learning, in CVPR, 2010
Cited by 11 (1 self)
Abstract
"... Dictionary generation is a core technique of the bag-of-visual-words (BOV) models when applied to image categorization. Most previous approaches generate dictionaries by unsupervised clustering techniques, e.g. k-means. However, the features obtained with such dictionaries may not be optimal for image classification. In this paper, we propose a probabilistic model for supervised dictionary learning (SDLM) which seamlessly combines an unsupervised model (a Gaussian Mixture Model) and a supervised model (a logistic regression model) in a probabilistic framework. ..."