Results 1 - 10 of 284
Using an ICA representation of high dimensional data for object recognition and classification
, 2001
"... This paper applies a bayesian classification scheme to the problem of recognition through probabilistic modeling of high dimensional data. In this context, high dimensionality does not allow precision in the density estimation. We propose a local Independent Component Analysis (ICA) representation o ..."
Cited by 3 (0 self)
Cepstral-like ICA representations for text-independent speaker recognition
 in Proc. 4th Int. Conf. on ICA and BSS (ICA2003)
, 2003
"... Automatic methods to determine voiceprints in speech samples predominantly use shorttime spectra to yield specific features of a given speaker. Among these, the Mel Frequency Cepstrum Coefficient (MFCC) features are widely used today. The speaker recognition method presented here is based on shor ..."
Cited by 6 (3 self)
introduce a speech synthesis model that can be identified using Independent Component Analysis (ICA). The ICA representations of log spectral data result in cepstral-like, independent coefficients, which capture correlations among frequency bands specific to the given speaker. It also results in speaker
Survey on Independent Component Analysis
 NEURAL COMPUTING SURVEYS
, 1999
"... A common problem encountered in such disciplines as statistics, data analysis, signal processing, and neural network research, is nding a suitable representation of multivariate data. For computational and conceptual simplicity, such a representation is often sought as a linear transformation of the ..."
Cited by 2309 (104 self)
of the original data. Well-known linear transformation methods include, for example, principal component analysis, factor analysis, and projection pursuit. A recently developed linear transformation method is independent component analysis (ICA), in which the desired representation is the one that minimizes
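The survey's framing of ICA as a linear transformation that recovers independent components can be sketched numerically (a minimal illustration assuming scikit-learn's FastICA; the two-source mixing setup below is invented for the demo, not taken from the survey):

```python
import numpy as np
from sklearn.decomposition import FastICA

rng = np.random.default_rng(0)
n = 5000
# Two independent non-Gaussian sources (uniform and Laplacian).
s = np.column_stack([rng.uniform(-1, 1, n), rng.laplace(0, 1, n)])
A = np.array([[1.0, 0.5],
              [0.5, 1.0]])            # linear mixing matrix
x = s @ A.T                           # observed mixtures

# ICA estimates an unmixing transform whose outputs are as independent
# as possible.
ica = FastICA(n_components=2, whiten="unit-variance", random_state=0)
s_hat = ica.fit_transform(x)

# Up to permutation, sign, and scale, each estimated component should
# correlate strongly with exactly one true source.
corr = np.abs(np.corrcoef(s.T, s_hat.T)[:2, 2:])
print(corr.max(axis=1))
```

The same mixtures decorrelated by PCA would generally remain statistically dependent; ICA's independence objective is what pins down the rotation.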
The "Independent Components" of Natural Scenes are Edge Filters
, 1997
"... It has previously been suggested that neurons with line and edge selectivities found in primary visual cortex of cats and monkeys form a sparse, distributed representation of natural scenes, and it has been reasoned that such responses should emerge from an unsupervised learning algorithm that attem ..."
Cited by 617 (29 self)
Face recognition by independent component analysis
 IEEE Transactions on Neural Networks
, 2002
"... Abstract—A number of current face recognition algorithms use face representations found by unsupervised statistical methods. Typically these methods find a set of basis images and represent faces as a linear combination of those images. Principal component analysis (PCA) is a popular example of such ..."
Cited by 348 (5 self)
produced a factorial face code. Both ICA representations were superior to representations based on PCA for recognizing faces across days and changes in expression. A classifier that combined the two ICA representations gave the best performance. Index Terms—Eigenfaces, face recognition, independent
Independent Component Representations for Face Recognition
"... In a task such as face recognition, much of the important information may be contained in the highorder relationships among the image pixels. A number of face recognition algorithms employ principal component analysis (PCA), which is based on the secondorder statistics of the image set, and does n ..."
Cited by 136 (9 self)
not address high-order statistical dependencies such as the relationships among three or more pixels. Independent component analysis (ICA) is a generalization of PCA which separates the high-order moments of the input in addition to the second-order moments. ICA was performed on a set of face images
Geometric ICA using Nonlinear Correlation And MDS
, 2003
"... We describe a method of visualising geometrically the dependency structure of a distributed representation. The mutual information between each pair of components is estimated using a nonlinear correlation coe# cient, in terms of which a distance measure is defined. Multidimensional scaling is then ..."
Cited by 1 (1 self)
is then used to generate a spatial configuration that reproduces these distances, the end result being a spatial representation of the dependency between the components, from which an appropriate topology for the representation may be inferred. The method is applied to ICA representations of speech and music.
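The pipeline this abstract describes (pairwise nonlinear dependence, converted to distances, embedded by MDS) can be sketched as follows. This is an assumption-laden toy: the correlation of absolute values stands in for the paper's nonlinear correlation coefficient, the distance 1 - dependence is one simple choice, and the gamma-envelope data is invented for the demo.

```python
import numpy as np
from sklearn.manifold import MDS

rng = np.random.default_rng(2)
n = 4000
# Toy "components": pairs (0,1) and (2,3) each share a common variance
# envelope, so within-pair components are uncorrelated yet dependent.
env1, env2 = rng.gamma(2.0, 1.0, n), rng.gamma(2.0, 1.0, n)
comps = np.column_stack([
    env1 * rng.standard_normal(n), env1 * rng.standard_normal(n),
    env2 * rng.standard_normal(n), env2 * rng.standard_normal(n),
])

# Nonlinear correlation: correlation of absolute values (a simple
# stand-in for the coefficient used in the paper).
dep = np.abs(np.corrcoef(np.abs(comps).T))
dist = 1.0 - dep                       # strong dependence -> small distance
np.fill_diagonal(dist, 0.0)

# MDS reproduces these distances as a 2-D spatial configuration.
xy = MDS(n_components=2, dissimilarity="precomputed",
         random_state=0).fit_transform(dist)

# Dependent components should land closer together than independent ones.
d01 = np.linalg.norm(xy[0] - xy[1])    # within a dependent pair
d02 = np.linalg.norm(xy[0] - xy[2])    # across independent pairs
print(d01 < d02)
```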
STATISTICAL INFERENCE OF MISSING SPEECH DATA IN THE ICA DOMAIN
"... We address the problem of speech estimation as statistical estimation with “missing ” data in the independent component analysis (ICA) domain. Missing components are substituted by values drawn from “similar ” data in a multifaceted ICA representation of the complete data. The paper presents the al ..."
Cited by 1 (1 self)
Improving Naive Bayes using Class-Conditional ICA
 In VIII Conferencia Iberoamericana de Inteligencia Artificial (IBERAMIA)
, 2002
"... In the past years, Naive Bayes has experienced a renaissance in machine learning, particularly in the area of information retrieval. This classifier is based on the not always realistic assumption that classconditional distributions can be factorized in the product of their marginal densities. On t ..."
Cited by 3 (0 self)
. On the other side, one of the most common ways of estimating the Independent Component Analysis (ICA) representation for a given random vector consists in minimizing the Kullback-Leibler distance between the joint density and the product of the marginal densities (mutual information). From this it follows that ICA
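The idea behind class-conditional ICA for Naive Bayes can be sketched as follows: fit one ICA per class so that the factorization assumption approximately holds in each class's own basis, then classify by factorized log-likelihoods (including the Jacobian of each transform). This is a hedged toy, not the paper's method: the Laplace marginals, the two-class shear-mixed data, and the median/MAD parameter estimates are all assumptions of this sketch.

```python
import numpy as np
from sklearn.decomposition import FastICA

rng = np.random.default_rng(3)
n = 3000
A0 = np.array([[1.0, 0.9], [0.0, 1.0]])    # class 0 mixing (toy)
A1 = np.array([[1.0, -0.9], [0.0, 1.0]])   # class 1 mixing (toy)

def make_class(A):
    # Independent Laplacian sources, linearly mixed.
    return rng.laplace(size=(n, 2)) @ A.T

class ICANaiveBayes:
    """Naive Bayes whose factorization is enforced by a per-class ICA
    transform, with Laplace marginals (a modelling choice of this sketch)."""
    def fit(self, class_data):
        self.models = []
        for X in class_data:
            ica = FastICA(whiten="unit-variance", random_state=0).fit(X)
            S = ica.transform(X)
            mu = np.median(S, axis=0)                       # Laplace locations
            b = np.mean(np.abs(S - mu), axis=0)             # Laplace scales
            logdet = np.log(abs(np.linalg.det(ica.components_)))
            self.models.append((ica, mu, b, logdet))
        return self

    def log_likelihood(self, X):
        cols = []
        for ica, mu, b, logdet in self.models:
            S = ica.transform(X)
            # Sum of independent Laplace log-densities + Jacobian term.
            ll = -np.sum(np.abs(S - mu) / b + np.log(2 * b), axis=1) + logdet
            cols.append(ll)
        return np.column_stack(cols)

    def predict(self, X):
        return self.log_likelihood(X).argmax(axis=1)

clf = ICANaiveBayes().fit([make_class(A0), make_class(A1)])
Xte = np.vstack([make_class(A0)[:300], make_class(A1)[:300]])
yte = np.array([0] * 300 + [1] * 300)
acc = (clf.predict(Xte) == yte).mean()
print(acc)
```

Fitting the marginals in each class's ICA basis is what makes the product-of-marginals likelihood a reasonable approximation even though the raw features are dependent.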