Out-of-Sample Extensions for LLE, Isomap, MDS, Eigenmaps, and Spectral Clustering (2004)

by Yoshua Bengio , Jean-François Paiement , Pascal Vincent , Olivier Delalleau , Nicolas Le Roux , Marie Ouimet
Venue: Advances in Neural Information Processing Systems
Citations:139 - 3 self
BibTeX

@INPROCEEDINGS{Bengio04out-of-sampleextensions,
    author = {Yoshua Bengio and Jean-François Paiement and Pascal Vincent and Olivier Delalleau and Nicolas Le Roux and Marie Ouimet},
    title = {Out-of-Sample Extensions for LLE, Isomap, MDS, Eigenmaps, and Spectral Clustering},
    booktitle = {Advances in Neural Information Processing Systems},
    year = {2004},
    pages = {177--184},
    publisher = {MIT Press}
}


Abstract

Several unsupervised learning algorithms based on an eigendecomposition provide either an embedding or a clustering only for given training points, with no straightforward extension for out-of-sample examples short of recomputing eigenvectors. This paper provides a unified framework for extending Local Linear Embedding (LLE), Isomap, Laplacian Eigenmaps, Multi-Dimensional Scaling (for dimensionality reduction) as well as for Spectral Clustering. This framework is based on seeing these algorithms as learning eigenfunctions of a data-dependent kernel.
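The abstract's key idea, viewing these algorithms as learning eigenfunctions of a data-dependent kernel, leads to a Nyström-style formula for embedding a new point: take its kernel values against the training set and project them onto the training eigenvectors, scaled by the inverse eigenvalues. A minimal sketch of that extension, assuming a simple Gaussian (RBF) kernel in place of the algorithm-specific data-dependent kernels the paper constructs for LLE, Isomap, etc.:

```python
import numpy as np

def rbf_kernel(A, B, gamma=1.0):
    # Pairwise Gaussian kernel between rows of A and rows of B.
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

rng = np.random.default_rng(0)
X = rng.normal(size=(50, 3))        # training points
K = rbf_kernel(X, X)                # data-dependent kernel matrix

# Eigendecomposition of the training kernel gives the embedding of
# the training points; keep the top-2 components (eigh is ascending).
lam, V = np.linalg.eigh(K)
lam, V = lam[::-1][:2], V[:, ::-1][:, :2]

# Nystrom-style out-of-sample extension: project the new point's
# kernel row onto the training eigenvectors, scaled by 1/lambda.
x_new = rng.normal(size=(1, 3))
k_new = rbf_kernel(x_new, X)        # kernel between new point and training set
y_new = k_new @ V / lam             # embedding of the out-of-sample point

# Sanity check: applying the same formula to the training points
# recovers their training embedding, since K V = V diag(lam).
y_train = K @ V / lam
assert np.allclose(y_train, V)
```

This avoids recomputing the eigendecomposition for each new example: the training-time spectrum `(lam, V)` is reused, and only one row of kernel evaluations is needed per out-of-sample point.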

Keyphrases

spectral clustering, out-of-sample extension, straightforward extension, multi-dimensional scaling, out-of-sample example, Laplacian eigenmaps, learning eigenfunctions, eigendecomposition, unified framework, data-dependent kernel, training point, dimensionality reduction, local linear embedding
