Results 1 - 10 of 685
Eigentaste: A Constant Time Collaborative Filtering Algorithm, 2000
"... Eigentaste is a collaborative filtering algorithm that uses universal queries to elicit real-valued user ratings on a common set of items and applies principal component analysis (PCA) to the resulting dense subset of the ratings matrix. PCA facilitates dimensionality reduction for offline clustering ..."
Cited by 378 (6 self)
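The offline dimensionality-reduction step this abstract describes can be illustrated with plain PCA on a dense ratings matrix. This is a minimal sketch via covariance eigendecomposition, not the Eigentaste algorithm itself; the matrix shapes and the choice of two components are assumptions for illustration.

```python
import numpy as np

def pca_project(ratings, k=2):
    """Project rows of a dense ratings matrix onto the top-k principal
    components (illustrative sketch of the PCA reduction step)."""
    X = ratings - ratings.mean(axis=0)        # center each item column
    cov = np.cov(X, rowvar=False)             # item-item covariance
    eigvals, eigvecs = np.linalg.eigh(cov)    # eigenvalues in ascending order
    top = eigvecs[:, np.argsort(eigvals)[::-1][:k]]  # top-k directions
    return X @ top                            # users as points in k-D space

rng = np.random.default_rng(0)
ratings = rng.uniform(-10, 10, size=(100, 8))  # hypothetical: 100 users, 8 items
coords = pca_project(ratings, k=2)
print(coords.shape)  # (100, 2)
```

With users mapped to a low-dimensional plane like this, clustering can be precomputed offline and new users assigned to a cluster in constant time, which is the design point the paper's title refers to.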
EM Algorithms for PCA and SPCA
in Advances in Neural Information Processing Systems, 1998
"... I present an expectation-maximization (EM) algorithm for principal component analysis (PCA). The algorithm allows a few eigenvectors and eigenvalues to be extracted from large collections of high dimensional data. It is computationally very efficient in space and time. It also naturally accommodates ..."
Cited by 146 (1 self)
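The iteration this abstract refers to can be sketched in a few lines: alternate a least-squares projection of the data onto the current basis (E-step) with a re-fit of the basis (M-step). This is a hedged sketch in the spirit of the paper, assuming zero-mean, noise-free low-rank data; variable names and the QR orthonormalization at the end are illustrative choices, not the paper's notation.

```python
import numpy as np

def em_pca(Y, k, iters=50, seed=0):
    """EM-style iteration for PCA (sketch). Y is d x n with zero-mean
    columns; returns an orthonormal d x k basis for the top-k subspace."""
    d, n = Y.shape
    C = np.random.default_rng(seed).standard_normal((d, k))  # random init
    for _ in range(iters):
        # E-step: least-squares coefficients of the data in the current basis
        X = np.linalg.solve(C.T @ C, C.T @ Y)
        # M-step: re-fit the basis to best reconstruct the data from X
        C = Y @ X.T @ np.linalg.inv(X @ X.T)
    Q, _ = np.linalg.qr(C)  # orthonormalize the recovered basis
    return Q

rng = np.random.default_rng(1)
# synthetic data: 500 points in 10-D lying exactly in a 3-D subspace
basis = np.linalg.qr(rng.standard_normal((10, 3)))[0]
Y = basis @ rng.standard_normal((3, 500))
Y -= Y.mean(axis=1, keepdims=True)
Q = em_pca(Y, k=3)
err = np.linalg.norm(Y - Q @ (Q.T @ Y)) / np.linalg.norm(Y)
print(err)
```

Note the space/time point from the abstract: each iteration touches the data only through products with a d x k matrix, so no d x d covariance matrix is ever formed.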
Distributed Clustering Using Collective Principal Component Analysis
Knowledge and Information Systems, 1999
"... This paper considers distributed clustering of high dimensional heterogeneous data using a distributed Principal Component Analysis (PCA) technique called the Collective PCA. It presents the Collective PCA technique, which can be used independently of the clustering application. It shows a way to inte ..."
Cited by 65 (9 self)
EM Algorithms for PCA and Sensible PCA
California Institute of Technology, Computation and Neural Systems, 1997
"... I present an expectation-maximization (EM) algorithm for principal component analysis (PCA). The algorithm allows a few eigenvectors and eigenvalues to be extracted from large collections of high dimensional data. It is computationally efficient in space and time and does not require computing the samp ..."
Cited by 8 (0 self)
Learning methods for generic object recognition with invariance to pose and lighting
in Proceedings of CVPR'04, 2004
"... We assess the applicability of several popular learning methods for the problem of recognizing generic visual categories with invariance to pose, lighting, and surrounding clutter. A large dataset comprising stereo image pairs of 50 uniform-colored toys under 36 angles, 9 azimuths, and 6 lighting conditions was collected (for a total of 194,400 individual images). The objects were 10 instances of 5 generic categories: four-legged animals, human figures, airplanes, trucks, and cars. Five instances of each category were used for training, and the other five for testing. Low-resolution grayscale images ..."
Cited by 253 (18 self)
Nonnegative Sparse PCA
in Neural Information Processing Systems, 2007
"... We describe a nonnegative variant of the "Sparse PCA" problem. The goal is to create a low dimensional representation from a collection of points which on the one hand maximizes the variance of the projected points and on the other uses only parts of the original coordinates, thereby creating a ..."
Cited by 26 (1 self)
PCA = Gabor for Expression Recognition
UCSD CSE TR CS629, 1999
"... We show that Gabor filter representations of facial images give quantitatively indistinguishable results for classification of facial expressions as local PCA representations, in contrast to other recent work. We then show that a linear discriminant analysis performed on the Gabor filter representat ..."
Cited by 9 (1 self)
PCA Gaussianization for image processing
in Proceedings of ICIP'09, 2009
"... The estimation of high-dimensional probability density functions (PDFs) is not an easy task for many image processing applications. The linear models assumed by widely used transforms are often quite restrictive to describe the PDF of natural images. In fact, additional nonlinear processing is needed to overcome the limitations of the model. On the contrary, the class of techniques collectively known as projection pursuit, which solve the high-dimensional problem by sequential univariate solutions, may be applied to very general PDFs (e.g. iterative Gaussianization procedures). However ..."
Cited by 4 (3 self)
Generalized principal component analysis (GPCA)
IEEE Transactions on Pattern Analysis and Machine Intelligence, 2003
"... This paper presents an algebro-geometric solution to the problem of segmenting an unknown number of subspaces of unknown and varying dimensions from sample data points. We represent the subspaces with a set of homogeneous polynomials whose degree is the number of subspaces and whose derivatives at a ... the data set by minimizing a certain distance function, thus dealing automatically with moderate noise in the data. A basis for the complement of each subspace is then recovered by applying standard PCA to the collection of derivatives (normal vectors). Extensions of GPCA that deal with data in a ..."
Cited by 206 (36 self)
Using Discrete PCA on Web Pages
in Workshop on Statistical Approaches to Web Mining, SAWM'04, at ECML, 2004
"... Discrete PCA builds components for discrete data rather like PCA and ICA do for real data. The method has a long history and is most commonly used in genetics. Recent insights into the method are described here, and some examples are given of its use in automatically building a topic model for ..."
Cited by 10 (6 self)