Results 1–10 of 3,310
Computing sparse multiples of polynomials
In Proc. Internat. Symp. on Algorithms and Computation (ISAAC), 2010
Cited by 2 (0 self)
"... We consider the problem of finding a sparse multiple of a polynomial. Given f ∈ F[x] of degree d over a field F, and a desired sparsity t, our goal is to determine if there exists a multiple h ∈ F[x] of f such that h has at most t nonzero terms, and if so, to find such an h. When F = Q and t is con ..."
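As a concrete instance of the problem stated in this abstract: f = x² + x + 1 has the 2-sparse multiple h = x³ − 1, since x³ − 1 = (x − 1)(x² + x + 1). A quick divisibility check with SymPy (the example polynomials are illustrative, not from the paper):

```python
from sympy import symbols, div, Poly

x = symbols('x')
f = Poly(x**2 + x + 1, x)   # the given dense polynomial f
h = Poly(x**3 - 1, x)       # candidate multiple with only 2 nonzero terms

q, r = div(h, f)            # polynomial division: h = q*f + r
print(r.is_zero, len(h.terms()))   # True 2  -> h is a 2-sparse multiple of f
```

Deciding whether such a sparse multiple *exists* for a given t is the hard part the paper addresses; the check above only verifies a candidate.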
Robust face recognition via sparse representation
IEEE Trans. Pattern Analysis and Machine Intelligence, 2008
Cited by 936 (40 self)
"... We consider the problem of automatically recognizing human faces from frontal views with varying expression and illumination, as well as occlusion and disguise. We cast the recognition problem as one of classifying among multiple linear regression models, and argue that new theory from sparse signa ..."
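A minimal sketch of the sparse-representation classification idea this abstract describes: express a test sample as a sparse combination of training columns via basis pursuit (an ℓ1 minimization posed as a linear program), then assign the class whose coefficients reconstruct the sample best. The toy dictionary, sizes, and solver are illustrative, not the paper's pipeline:

```python
import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(0)

# Toy dictionary: 4 training columns per class, 2 classes, samples in R^8.
n, per_class = 8, 4
A = rng.standard_normal((n, 2 * per_class))
A /= np.linalg.norm(A, axis=0)              # unit-norm columns
m = A.shape[1]

# Test sample built from two class-0 training columns.
b = 0.7 * A[:, 0] + 0.3 * A[:, 2]

# Basis pursuit  min ||x||_1  s.t.  Ax = b,  as an LP via the split x = u - v.
res = linprog(c=np.ones(2 * m),
              A_eq=np.hstack([A, -A]), b_eq=b,
              bounds=[(0, None)] * (2 * m))
x = res.x[:m] - res.x[m:]

# Classify by whichever class's coefficients reconstruct b with least residual.
residuals = []
for c in range(2):
    xc = np.zeros(m)
    cols = slice(c * per_class, (c + 1) * per_class)
    xc[cols] = x[cols]
    residuals.append(np.linalg.norm(b - A @ xc))
pred = int(np.argmin(residuals))            # -> 0, the correct class
```

The paper's point is that with enough training samples per class, the sparsest representation concentrates on the correct class even under occlusion; this sketch only shows the classification-by-residual mechanics.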
Resistant Sparse Multiple Canonical Correlation
2014
"... Canonical Correlation Analysis (CCA) is a multivariate technique that takes two datasets and forms the most highly correlated possible pairs of linear combinations between them. Each subsequent pair of linear combinations is orthogonal to the preceding pair, meaning that new information is gleaned from each pair. By looking at the magnitude of coefficient values, we can find out which variables can be grouped together, thus better understanding multiple interactions that are otherwise difficult to compute or grasp intuitively. CCA appears to have quite powerful applications to high throughput ..."
Stable recovery of sparse overcomplete representations in the presence of noise
IEEE Trans. Inform. Theory, 2006
Cited by 460 (22 self)
"... Overcomplete representations are attracting interest in signal processing theory, particularly due to their potential to generate sparse representations of signals. However, in general, the problem of finding sparse representations must be unstable in the presence of noise. This paper establishes t ... that the overcomplete system is incoherent, it is shown that the optimally sparse approximation to the noisy data differs from the optimally sparse decomposition of the ideal noiseless signal by at most a constant multiple of the noise level. As this optimal-sparsity method requires heavy (combinatorial) computational ..."
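The stability claim is easy to observe numerically: a sparse approximation computed from noisy data stays within a small multiple of the noise level of the true sparse decomposition. The sketch below uses greedy orthogonal matching pursuit on a random incoherent dictionary as a stand-in for the paper's (combinatorial) optimal-sparsity method; all sizes and the solver choice are illustrative:

```python
import numpy as np
from sklearn.linear_model import OrthogonalMatchingPursuit

rng = np.random.default_rng(3)

# Overcomplete, incoherent dictionary: 128 random unit-norm atoms in R^64.
n, m = 64, 128
D = rng.standard_normal((n, m))
D /= np.linalg.norm(D, axis=0)

# Ideal signal: sparse combination of 3 atoms, then additive noise.
x0 = np.zeros(m)
x0[[5, 40, 77]] = [1.5, -2.0, 1.0]
y = D @ x0 + 0.01 * rng.standard_normal(n)

# Sparse approximation of the noisy data; its coefficients stay close to x0.
omp = OrthogonalMatchingPursuit(n_nonzero_coefs=3).fit(D, y)
err = np.linalg.norm(omp.coef_ - x0)        # small relative to coefficient scale
```

Here err is on the order of the noise level, far below the smallest true coefficient, which is the qualitative content of the paper's bound.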
From Sparse Regression to Sparse Multiple Correspondence Analysis
"... funded by the R&D department of Chanel cosmetic company • Industrial context and motivation: – Relate gene expression data to skin aging measures – n = 500, p = 800 000 SNPs, 15 000 genes ..."
Non-sparse multiple kernel Fisher discriminant analysis
Journal of Machine Learning Research
Cited by 6 (0 self)
"... Sparsity-inducing multiple kernel Fisher discriminant analysis (MK-FDA) has been studied in the literature. Building on recent advances in non-sparse multiple kernel learning (MKL), we propose a non-sparse version of MK-FDA, which imposes a general ℓp-norm regularisation on the kernel weights. We fo ..."
Parallel Numerical Linear Algebra
1993
Cited by 773 (23 self)
"... We survey general techniques and open problems in numerical linear algebra on parallel architectures. We first discuss basic principles of parallel processing, describing the costs of basic operations on parallel machines, including general principles for constructing efficient algorithms. We illustrate these principles using current architectures and software systems, and by showing how one would implement matrix multiplication. Then, we present direct and iterative algorithms for solving linear systems of equations, linear least squares problems, the symmetric eigenvalue problem ..."
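The survey's running example is matrix multiplication, where the standard parallel decomposition works on blocks. A serial sketch of that blocked partitioning (block size and shapes are illustrative; a parallel version would assign the independent (i, j) output blocks to different processors):

```python
import numpy as np

def blocked_matmul(A, B, bs=2):
    """Compute A @ B by accumulating products of bs-by-bs blocks.

    Each (i, j) output block is an independent task -- the basic unit of
    work a parallel implementation would hand to a processor.  Assumes
    square matrices whose size is divisible by bs (illustration only).
    """
    n = A.shape[0]
    C = np.zeros((n, n))
    for i in range(0, n, bs):
        for j in range(0, n, bs):
            for k in range(0, n, bs):
                C[i:i+bs, j:j+bs] += A[i:i+bs, k:k+bs] @ B[k:k+bs, j:j+bs]
    return C

rng = np.random.default_rng(2)
A, B = rng.standard_normal((4, 4)), rng.standard_normal((4, 4))
C = blocked_matmul(A, B)                    # matches A @ B
```

Blocking also controls communication and cache traffic, which is why it is the starting point for the distributed algorithms the survey goes on to present.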
Multi-task learning via non-sparse multiple kernel learning
In CAIP, 2011 (accepted)
Cited by 2 (1 self)
"... In object classification tasks from digital photographs, multiple categories are considered for annotation. Some of these visual concepts may have semantic relations and can appear simultaneously in images. Although taxonomical relations and co-occurrence structures between object categories have been studied, it is not easy to use such information to enhance performance of object classification. In this paper, we propose a novel multi-task learning procedure which extracts useful information from the classifiers for the other categories. Our approach is based on non-sparse multiple ..."
Algorithms for simultaneous sparse approximation. Part II: Convex relaxation
2004
Cited by 366 (5 self)
"... A simultaneous sparse approximation problem requests a good approximation of several input signals at once using different linear combinations of the same elementary signals. At the same time, the problem balances the error in approximation against the total number of elementary signals th ..."
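One common convex relaxation of simultaneous sparse approximation penalizes the ℓ2 norms of the rows of the coefficient matrix, so every signal selects the same few elementary signals. scikit-learn's MultiTaskLasso implements this ℓ1/ℓ2 penalty; it is used here as an illustration of the relaxation idea, not as the exact program studied in the paper (sizes, alpha, and data are invented):

```python
import numpy as np
from sklearn.linear_model import MultiTaskLasso

rng = np.random.default_rng(4)

# 10 elementary signals (features), 3 input signals (tasks), all three
# built from the SAME two atoms -- the simultaneous-sparsity structure.
n, p, tasks = 60, 10, 3
X = rng.standard_normal((n, p))
W = np.zeros((p, tasks))
W[1] = [1.5, -1.0, 2.0]                     # shared atom 1
W[4] = [-2.0, 1.2, 0.8]                     # shared atom 4
Y = X @ W + 0.01 * rng.standard_normal((n, tasks))

# Joint (l1/l2) penalty on rows of the coefficient matrix, so all tasks
# select the same atoms.
model = MultiTaskLasso(alpha=0.1).fit(X, Y)
row_norms = np.linalg.norm(model.coef_, axis=0)   # one norm per atom
top2 = set(np.argsort(row_norms)[-2:])            # dominant shared atoms
```

The two dominant rows of the fitted coefficient matrix coincide with the planted shared atoms, which is the behavior the relaxation is designed to guarantee under the paper's conditions.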
Support vector machine learning for interdependent and structured output spaces
In ICML, 2004
Cited by 450 (20 self)
"... Learning general functional dependencies is one of the main goals in machine learning. Recent progress in kernel-based methods has focused on designing flexible and powerful input representations. This paper addresses the complementary issue of problems involving complex outputs such as multiple depe ..."