Results 1–10 of 11
Fisher Discrimination Dictionary Learning for Sparse Representation
Abstract

Cited by 73 (9 self)
Sparse representation based classification has led to interesting image recognition results, while the dictionary used for sparse coding plays a key role in it. This paper presents a novel dictionary learning (DL) method to improve the pattern classification performance. Based on the Fisher discrimination criterion, a structured dictionary, whose dictionary atoms have correspondence to the class labels, is learned so that the reconstruction error after sparse coding can be used for pattern classification. Meanwhile, the Fisher discrimination criterion is imposed on the coding coefficients so that they have small within-class scatter but big between-class scatter. A new classification scheme associated with the proposed Fisher discrimination DL (FDDL) method is then presented by using both the discriminative information in the reconstruction error and sparse coding coefficients. The proposed FDDL is extensively evaluated on benchmark image databases in comparison with existing sparse representation and DL based classification methods.
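Several entries in this listing share the same classification scheme: code a test sample over a structured dictionary and assign it to the class whose sub-dictionary reconstructs it best. A minimal sketch of that residual rule, on hypothetical toy data, with ridge-regularized least squares standing in for true sparse coding for brevity:

```python
import numpy as np

def residual_classify(x, sub_dicts, lam=0.1):
    """Assign x to the class whose sub-dictionary reconstructs it best.

    sub_dicts: list of (d, k_c) arrays, one per class.
    Ridge-regularized least squares stands in for sparse coding here.
    """
    residuals = []
    for D in sub_dicts:
        # alpha = argmin ||x - D a||^2 + lam ||a||^2  (closed form)
        A = D.T @ D + lam * np.eye(D.shape[1])
        alpha = np.linalg.solve(A, D.T @ x)
        residuals.append(np.linalg.norm(x - D @ alpha))
    return int(np.argmin(residuals))

# Toy example: two classes whose atoms span disjoint coordinates.
D0 = np.eye(4)[:, :2]                  # class-0 atoms span coords 0-1
D1 = np.eye(4)[:, 2:]                  # class-1 atoms span coords 2-3
x = np.array([1.0, 0.5, 0.01, 0.0])    # lies (almost) in class 0's span
print(residual_classify(x, [D0, D1]))  # -> 0
```

FDDL additionally uses the coding coefficients themselves as discriminative features; this sketch keeps only the residual part of the rule.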
Solving Structured Sparsity Regularization with Proximal Methods
Proc. of ECML, 2010
Abstract

Cited by 20 (6 self)
Abstract. Proximal methods have recently been shown to provide effective optimization procedures to solve the variational problems defining the ℓ1 regularization algorithms. The goal of the paper is twofold. First we discuss how proximal methods can be applied to solve a large class of machine learning algorithms which can be seen as extensions of ℓ1 regularization, namely structured sparsity regularization. For all these algorithms, it is possible to derive an optimization procedure which corresponds to an iterative projection algorithm. Second, we discuss the effect of a preconditioning of the optimization procedure achieved by adding a strictly convex functional to the objective function. Structured sparsity algorithms are usually based on minimizing a convex (not strictly convex) objective function and this might lead to undesired unstable behavior. We show that by perturbing the objective function by a small strictly convex term we often reduce substantially the number of required computations without affecting the prediction performance of the obtained solution.
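For plain ℓ1 regularization, the iterative procedure the paper refers to reduces to the classical ISTA step: a gradient step on the smooth data term followed by the proximal operator of the penalty (soft thresholding). A minimal sketch on a hypothetical noiseless least-squares problem:

```python
import numpy as np

def soft_threshold(v, t):
    """Proximal operator of t * ||.||_1 (componentwise shrinkage)."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def ista(X, y, lam, n_iter=500):
    """Minimize 0.5 ||X w - y||^2 + lam ||w||_1 by proximal gradient."""
    step = 1.0 / np.linalg.norm(X, 2) ** 2   # 1 / Lipschitz constant of the gradient
    w = np.zeros(X.shape[1])
    for _ in range(n_iter):
        grad = X.T @ (X @ w - y)             # gradient of the smooth term
        w = soft_threshold(w - step * grad, step * lam)
    return w

rng = np.random.default_rng(0)
X = rng.standard_normal((50, 10))
w_true = np.zeros(10)
w_true[:3] = [2.0, -1.5, 1.0]                # only 3 active features
y = X @ w_true
w = ista(X, y, lam=0.1)
print(np.round(w, 2))  # large weights on the first 3 features, rest ~0
```

Structured penalties replace the componentwise shrinkage with the proximal operator (a projection) of the corresponding structured norm; the outer iteration is unchanged.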
LEARNING DISCRIMINATIVE DICTIONARIES WITH PARTIALLY LABELED DATA
Abstract

Cited by 5 (3 self)
While recent techniques for discriminative dictionary learning have demonstrated tremendous success in image analysis applications, their performance is often limited by the amount of labeled data available for training. Even though labeling images is difficult, it is relatively easy to collect unlabeled images, either by querying the web or from public datasets. In this paper, we propose a discriminative dictionary learning technique which utilizes both labeled and unlabeled data for learning dictionaries. Extensive evaluation on existing datasets demonstrates that the proposed method performs significantly better than state-of-the-art dictionary learning approaches when unlabeled images are available for training. Index Terms — Semi-supervised dictionary learning, latent variables, classification.
A Regularization Approach to Nonlinear Variable Selection
Abstract

Cited by 5 (3 self)
In this paper we consider a regularization approach to variable selection when the regression function depends nonlinearly on a few input variables. The proposed method is based on a regularized least-squares estimator penalizing large values of the partial derivatives. An efficient iterative procedure is proposed to solve the underlying variational problem, and its convergence is proved. The empirical properties of the obtained estimator are tested both for prediction and variable selection. The algorithm compares favorably to more standard ridge regression and ℓ1 regularization schemes.
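The idea behind the derivative penalty is that a variable the function does not depend on has near-zero partial derivative everywhere. This is not the paper's estimator, but a crude illustration of that principle: fit a generic nonlinear regressor (Gaussian-kernel ridge, a hypothetical stand-in), then score each variable by its mean squared partial derivative estimated with finite differences:

```python
import numpy as np

def kernel_ridge_fit(X, y, gamma=1.0, lam=1e-2):
    """Gaussian-kernel ridge regression; returns a predictor function."""
    sq = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    K = np.exp(-gamma * sq)
    c = np.linalg.solve(K + lam * np.eye(len(X)), y)
    def predict(Z):
        sqz = ((Z[:, None, :] - X[None, :, :]) ** 2).sum(-1)
        return np.exp(-gamma * sqz) @ c
    return predict

def derivative_scores(predict, X, eps=1e-3):
    """Mean squared partial derivative per variable (central differences).
    Variables the fitted function barely depends on score near zero."""
    scores = []
    for j in range(X.shape[1]):
        E = np.zeros_like(X)
        E[:, j] = eps
        d = (predict(X + E) - predict(X - E)) / (2 * eps)
        scores.append(np.mean(d ** 2))
    return np.array(scores)

rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, size=(200, 5))
y = np.sin(2 * X[:, 0]) + X[:, 1] ** 2   # depends on variables 0 and 1 only
scores = derivative_scores(kernel_ridge_fit(X, y), X)
print(np.round(scores, 3))  # noticeably larger for variables 0 and 1
```

The paper instead puts the derivative penalty inside the training objective, so selection and fitting happen jointly rather than in this two-stage way.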
PADDLE: proximal algorithm for dual dictionaries learning
In: Artificial neural networks and machine learning – ICANN 2011. Lecture notes in computer science, 2011
Abstract

Cited by 2 (0 self)
Abstract. Recently, considerable research efforts have been devoted to the design of methods to learn from data overcomplete dictionaries for sparse coding. However, learned dictionaries require the solution of a sparse approximation problem for coding new data. In order to overcome this drawback, we propose an algorithm aimed at learning both a dictionary and its dual: a linear mapping directly performing the coding. Our algorithm is based on proximal methods and jointly minimizes the reconstruction error of the dictionary and the coding error of its dual; the sparsity of the representation is induced by an ℓ1-based penalty on its coefficients. Experimental results show that the algorithm is capable of recovering the expected dictionaries. Furthermore, on a benchmark dataset the image features obtained from the dual matrix yield state-of-the-art classification performance while being much less computationally intensive.
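The payoff of learning a dual is at test time: coding becomes a single matrix multiply (optionally followed by a cheap shrinkage) instead of an iterative sparse solver. A rough sketch of the idea, with the dual crudely approximated by the pseudo-inverse of a fixed orthonormal dictionary rather than learned jointly as in PADDLE (which also handles overcomplete dictionaries):

```python
import numpy as np

def soft_threshold(v, t):
    """Componentwise shrinkage: proximal operator of t * ||.||_1."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

rng = np.random.default_rng(0)
d = 8
D = np.linalg.qr(rng.standard_normal((d, d)))[0]  # orthonormal dictionary
C = np.linalg.pinv(D)                             # stand-in for the learned dual

# Sparse ground-truth code, then the observed signal.
a_true = np.zeros(d)
a_true[[1, 5]] = [1.5, -2.0]
x = D @ a_true

# Feed-forward coding: one matrix multiply plus shrinkage, no solver.
a_hat = soft_threshold(C @ x, 0.1)
print(np.round(a_hat, 2))  # nonzero (shrunk) only at indices 1 and 5
```

In the orthonormal case the dual is exact; PADDLE's point is to make such a feed-forward encoder work for learned, overcomplete dictionaries by training it jointly with the dictionary.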
unknown title
Abstract
Dictionary learning (DL) for sparse coding has shown promising results in classification tasks, while how to adaptively build the relationship between dictionary atoms and class labels is still an important open question. The existing dictionary learning approaches simply fix a dictionary atom to be either class-specific or shared by all classes beforehand, ignoring that the relationship needs to be updated during DL. To address this issue, in this paper we propose a novel latent dictionary learning (LDL) method to learn a discriminative dictionary and build its relationship to class labels adaptively. Each dictionary atom is jointly learned with a latent vector, which associates this atom to the representation of different classes. More specifically, we introduce a latent representation model, in which discrimination of the learned dictionary is exploited via minimizing the within-class scatter of coding coefficients and the latent-value weighted dictionary coherence. The optimal solution is efficiently obtained by the proposed solving algorithm. Correspondingly, a latent sparse representation based classifier is also presented. Experimental results demonstrate that our algorithm outperforms many recently proposed sparse representation and dictionary learning approaches for action, gender and face recognition.
Sparse Representation based Fisher . . .
Abstract
The employed dictionary plays an important role in sparse representation or sparse coding based image reconstruction and classification, while learning dictionaries from the training data has led to state-of-the-art results in image classification tasks. However, many dictionary learning models exploit only the discriminative information in either the representation coefficients or the representation residual, which limits their performance. In this paper we present a novel dictionary learning method based on the Fisher discrimination criterion. A structured dictionary, whose atoms have correspondences to the subject class labels, is learned, with which not only the representation residual can be used to distinguish different classes, but also the representation coefficients have small within-class scatter and big between-class scatter. The classification scheme associated with the proposed Fisher discrimination dictionary learning (FDDL) model is consequently presented by exploiting the discriminative information in both the representation residual and the representation coefficients. The proposed FDDL model is extensively evaluated on various image datasets, and it shows superior performance to many state-of-the-art dictionary learning methods in a variety of classification tasks.
Non-Linear Dictionary Learning with Partially Labeled Data
2013
Abstract
While recent techniques for discriminative dictionary learning have demonstrated tremendous success in image analysis applications, their performance is often limited by the amount of labeled data available for training. Even though labeling images is difficult, it is relatively easy to collect unlabeled images, either by querying the web or from public datasets. Using the kernel method, we propose a nonlinear discriminative dictionary learning technique which utilizes both labeled and unlabeled data for learning dictionaries in the high-dimensional feature space. Furthermore, we show how this method can be extended to the ambiguously labeled classification problem, where each training sample has multiple labels and only one of them is correct. Extensive evaluation on existing datasets demonstrates that the proposed method performs significantly better than state-of-the-art dictionary learning approaches when unlabeled images are available for training.
Max Residual Classifier
Abstract
We introduce a novel classifier, called max residual classifier (MRC), for learning a sparse representation jointly with a discriminative decision function. MRC seeks to maximize the differences between the residual errors of the wrong classes and the right one. This effectively leads to a more discriminative sparse representation and better classification accuracy. The optimization procedure is simple and efficient. Its objective function is closely related to the decision function of the residual classification strategy. Unlike existing methods for learning discriminative sparse representation that are restricted to a linear model, our approach is able to work with a nonlinear model via the use of a Mercer kernel. Experimental results show that MRC is able to capture meaningful and compact structures of data. Its performance compares favourably with the current state of the art on challenging benchmarks including rotated MNIST.
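The quantity MRC pushes up during training can be read off from the standard residual-classification rule: the gap between the smallest wrong-class residual and the right-class residual. A toy sketch of that margin (hypothetical data; least-squares coding stands in for sparse coding, and the joint learning itself is not shown):

```python
import numpy as np

def class_residuals(x, sub_dicts):
    """Reconstruction error of x under each class's sub-dictionary."""
    res = []
    for D in sub_dicts:
        alpha, *_ = np.linalg.lstsq(D, x, rcond=None)
        res.append(np.linalg.norm(x - D @ alpha))
    return np.array(res)

def residual_margin(x, y, sub_dicts):
    """MRC-style margin: smallest wrong-class residual minus the
    right-class residual.  Positive means x is classified correctly,
    and larger is better."""
    r = class_residuals(x, sub_dicts)
    wrong = np.delete(r, y)
    return wrong.min() - r[y]

D0 = np.eye(4)[:, :2]               # class-0 atoms span coords 0-1
D1 = np.eye(4)[:, 2:]               # class-1 atoms span coords 2-3
x = np.array([1.0, 0.5, 0.0, 0.0])  # perfectly explained by class 0
print(residual_margin(x, 0, [D0, D1]))  # positive margin
```

Maximizing this margin over the training set couples the dictionary to the decision rule, which is what distinguishes MRC from learning the representation and the classifier separately.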