Results 1 – 10 of 2,238
Learning over Sets using Kernel Principal Angles
Journal of Machine Learning Research, 2003
"... We consider the problem of learning with instances defined over a space of sets of vectors. We derive a new positive definite kernel f (A,B) defined over pairs of matrices A,B based on the concept of principal angles between two linear subspaces. We show that the principal angles can be recovered ..."
Cited by 106 (2 self)
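The construction is easy to illustrate. Below is a minimal NumPy sketch (not the authors' code; all names are illustrative) that recovers the principal angles between the column spaces of A and B from orthonormal bases via an SVD, then forms a subspace similarity from the product of squared cosines, one natural choice in the family of kernels the abstract describes:

    import numpy as np

    def principal_angles(A, B):
        # Orthonormal bases for the column spaces of A and B.
        Qa, _ = np.linalg.qr(A)
        Qb, _ = np.linalg.qr(B)
        # Singular values of Qa^T Qb are the cosines of the principal angles.
        cos_t = np.linalg.svd(Qa.T @ Qb, compute_uv=False)
        return np.arccos(np.clip(cos_t, 0.0, 1.0))

    def subspace_kernel(A, B):
        # Product of squared cosines of the principal angles: a simple
        # similarity between the two subspaces spanned by A and B.
        return float(np.prod(np.cos(principal_angles(A, B)) ** 2))

    # Example: two 5-dimensional subspaces of R^50.
    rng = np.random.default_rng(0)
    print(subspace_kernel(rng.standard_normal((50, 5)),
                          rng.standard_normal((50, 5))))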
Kernel Principal Angles for Classification Machines with Applications to Image Sequence Interpretation
2002
"... We consider the problem of learning with instances defined over a space of sets of vectors. We derive a new positive definite kernel f(A# B) defined over pairs of matrices A# B based on the concept of principal angles between two linear subspaces. We show that the principal angles can be recovered ..."
Cited by 48 (6 self)
Nonlinear component analysis as a kernel eigenvalue problem
1996
"... We describe a new method for performing a nonlinear form of Principal Component Analysis. By the use of integral operator kernel functions, we can efficiently compute principal components in highdimensional feature spaces, related to input space by some nonlinear map; for instance the space of all ..."
Cited by 1573 (83 self)
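As a toy sketch of the kernel eigenvalue problem described above (my reconstruction, not the paper's code; the Gaussian kernel and its gamma parameter are arbitrary illustrative choices): center the Gram matrix, solve its eigenproblem, and project onto the leading components.

    import numpy as np

    def kernel_pca(X, n_components=2, gamma=1.0):
        # Gaussian kernel matrix K_ij = exp(-gamma * ||x_i - x_j||^2).
        sq = np.sum(X ** 2, axis=1)
        K = np.exp(-gamma * (sq[:, None] + sq[None, :] - 2.0 * X @ X.T))
        # Double-center K so the mapped data have zero mean in feature space.
        n = len(K)
        J = np.eye(n) - np.ones((n, n)) / n
        Kc = J @ K @ J
        # Solve the kernel eigenvalue problem; eigh returns ascending order.
        w, V = np.linalg.eigh(Kc)
        w, V = w[::-1][:n_components], V[:, ::-1][:, :n_components]
        # Scale the expansion coefficients so each component has unit norm
        # in feature space, then project the training points.
        alphas = V / np.sqrt(np.maximum(w, 1e-12))
        return Kc @ alphas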
The NAS Parallel Benchmarks
The International Journal of Supercomputer Applications, 1991
"... A new set of benchmarks has been developed for the performance evaluation of highly parallel supercomputers. These benchmarks consist of ve \parallel kernel " benchmarks and three \simulated application" benchmarks. Together they mimic the computation and data movement characterist ..."
Cited by 694 (9 self)
Kernel principal component analysis
Advances in Kernel Methods – Support Vector Learning, 1999
"... A new method for performing a nonlinear form of Principal Component Analysis is proposed. By the use of integral operator kernel functions, one can efficiently compute principal components in highdimensional feature spaces, related to input space by some nonlinear map; for instance the space of all ..."
Cited by 274 (7 self)
Probabilistic nonlinear principal component analysis with Gaussian process latent variable models
Journal of Machine Learning Research, 2005
"... Summarising a high dimensional data set with a low dimensional embedding is a standard approach for exploring its structure. In this paper we provide an overview of some existing techniques for discovering such embeddings. We then introduce a novel probabilistic interpretation of principal component ..."
Cited by 229 (24 self)
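A compressed sketch of the objective behind such a model: place a Gaussian process prior over the mapping from latent to data space, marginalize it out, and optimize the latent coordinates by maximum likelihood. The toy below fixes an RBF kernel and uses a generic optimizer with numerical gradients; the names, kernel, and PCA initialization are illustrative assumptions, not the paper's algorithm:

    import numpy as np
    from scipy.optimize import minimize

    def gplvm_embed(Y, q=2, lengthscale=1.0, noise=1e-2):
        # Negative GP marginal likelihood of the data Y as a function of the
        # latent positions X (one RBF kernel shared across output dimensions).
        n, d = Y.shape
        YYt = Y @ Y.T

        def objective(xflat):
            X = xflat.reshape(n, q)
            D2 = np.sum((X[:, None] - X[None, :]) ** 2, axis=-1)
            K = np.exp(-D2 / (2.0 * lengthscale ** 2)) + noise * np.eye(n)
            _, logdet = np.linalg.slogdet(K)
            return 0.5 * (d * logdet + np.trace(np.linalg.solve(K, YYt)))

        # Initialize from PCA, then optimize (finite-difference gradients,
        # so only suitable for tiny examples).
        x0 = np.linalg.svd(Y - Y.mean(0), full_matrices=False)[0][:, :q]
        res = minimize(objective, x0.ravel(), method="L-BFGS-B")
        return res.x.reshape(n, q)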
Geometric diffusions as a tool for harmonic analysis and structure definition of data: Diffusion maps
Proceedings of the National Academy of Sciences, 2005
"... of contexts of data analysis, such as spectral graph theory, manifold learning, nonlinear principal components and kernel methods. We augment these approaches by showing that the diffusion distance is a key intrinsic geometric quantity linking spectral theory of the Markov process, Laplace operators ..."
Cited by 257 (45 self)
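The basic diffusion-map construction is short enough to sketch: form Gaussian affinities, normalize them into a Markov matrix, and embed with its leading nontrivial eigenvectors. The bandwidth eps and diffusion time t below are arbitrary illustrative parameters:

    import numpy as np

    def diffusion_map(X, n_components=2, eps=1.0, t=1):
        # Gaussian affinities between all pairs of points.
        D2 = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=-1)
        W = np.exp(-D2 / eps)
        d = W.sum(axis=1)
        # The random-walk matrix P = D^-1 W is similar to the symmetric
        # S = D^-1/2 W D^-1/2, so we diagonalize S instead (stable, real).
        S = W / np.sqrt(np.outer(d, d))
        w, Phi = np.linalg.eigh(S)
        w, Phi = w[::-1], Phi[:, ::-1]
        Psi = Phi / np.sqrt(d)[:, None]   # right eigenvectors of P
        # Diffusion coordinates at time t: lambda_k^t * psi_k(x),
        # dropping the constant top eigenvector.
        return (w[1:n_components + 1] ** t) * Psi[:, 1:n_components + 1]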
Kernel PCA and De-Noising in Feature Spaces
Advances in Neural Information Processing Systems 11, 1999
"... Kernel PCA as a nonlinear feature extractor has proven powerful as a preprocessing step for classification algorithms. But it can also be considered as a natural generalization of linear principal component analysis. This gives rise to the question how to use nonlinear features for data compress ..."
Cited by 170 (15 self)
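For a Gaussian kernel, an approximate pre-image of a denoised feature-space point can be found by a fixed-point iteration of the kind used in this line of work. The sketch below assumes the expansion coefficients gamma of that point over the training set have already been obtained from kernel PCA; it is a simplified illustration, not the paper's implementation:

    import numpy as np

    def gaussian_preimage(X, gamma, c, z0, n_iter=100):
        # Fixed-point iteration for an approximate pre-image z of the
        # feature-space point sum_i gamma_i k(x_i, .), k Gaussian of width c.
        z = z0.copy()
        for _ in range(n_iter):
            k = np.exp(-np.sum((X - z) ** 2, axis=1) / c)
            w = gamma * k
            s = w.sum()
            if abs(s) < 1e-12:   # degenerate step; keep current estimate
                break
            z = w @ X / s        # kernel-weighted average of training points
        return z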
Kernel partial least squares regression in reproducing kernel Hilbert space
Journal of Machine Learning Research, 2001
"... A family of regularized least squares regression models in a Reproducing Kernel Hilbert Space is extended by the kernel partial least squares (PLS) regression model. Similar to principal components regression (PCR), PLS is a method based on the projection of input (explanatory) variables to the late ..."
Cited by 154 (10 self)
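The core idea can be sketched as a NIPALS-style iteration carried out on the Gram matrix K rather than on the input variables. This is a simplified reconstruction (centering and the final regression step are omitted; Y is an n-by-q response matrix):

    import numpy as np

    def kernel_pls_scores(K, Y, n_components, n_iter=100):
        # Extract latent score vectors t from the Gram matrix K, deflating
        # K and Y after each component, in the spirit of NIPALS-based PLS.
        n = K.shape[0]
        T = np.zeros((n, n_components))
        Kd, Yd = K.copy(), Y.copy()
        for j in range(n_components):
            u = Yd[:, :1].copy()
            for _ in range(n_iter):
                t = Kd @ u
                t /= np.linalg.norm(t)
                c = Yd.T @ t
                u = Yd @ c
                u /= np.linalg.norm(u)
            t = t.ravel()
            T[:, j] = t
            # Deflate: remove the extracted component from K and Y.
            P = np.eye(n) - np.outer(t, t)
            Kd = P @ Kd @ P
            Yd = Yd - np.outer(t, t @ Yd)
        return T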
Numerical methods for computing angles between linear subspaces
1971
"... Assume that two subspaces F and G of a unitary space are defined.. as the ranges(or nullspacd of given rectangular matrices A and B. Accurate numerical methods are developed for computing the principal angles ek(F,G) and orthogonal sets of principal vectors u k 6 F and vk c G, k = 1,2,..., q = d ..."
Cited by 164 (4 self)