
Manifold regularization: A geometric framework for learning from labeled and unlabeled examples (2006)

by Mikhail Belkin, Partha Niyogi, Vikas Sindhwani
Venue: Journal of Machine Learning Research
Citations: 577 (16 self)

BibTeX

@ARTICLE{Belkin06manifoldregularization,
    author = {Mikhail Belkin and Partha Niyogi and Vikas Sindhwani},
    title = {Manifold regularization: A geometric framework for learning from labeled and unlabeled examples},
    journal = {Journal of Machine Learning Research},
    year = {2006}
}


Abstract

We propose a family of learning algorithms based on a new form of regularization that allows us to exploit the geometry of the marginal distribution. We focus on a semi-supervised framework that incorporates labeled and unlabeled data in a general-purpose learner. Some transductive graph learning algorithms and standard methods, including Support Vector Machines and Regularized Least Squares, can be obtained as special cases. We utilize properties of Reproducing Kernel Hilbert Spaces to prove new Representer theorems that provide a theoretical basis for the algorithms. As a result (in contrast to purely graph-based approaches), we obtain a natural out-of-sample extension to novel examples and so are able to handle both transductive and truly semi-supervised settings. We present experimental evidence suggesting that our semi-supervised algorithms are able to use unlabeled data effectively. Finally, we briefly discuss unsupervised and fully supervised learning within our general framework.
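To make the framework concrete, here is a minimal sketch of one of its instances, Laplacian Regularized Least Squares (LapRLS): the representer theorem gives f(x) = Σᵢ αᵢ K(xᵢ, x) over all labeled and unlabeled points, and the expansion coefficients have the closed form α = (JK + γ_A l I + (γ_I l/(u+l)²) L K)⁻¹ Y, where L is a graph Laplacian built on all points and J zeroes out the unlabeled rows. The kernel bandwidth, k-NN graph construction, and regularization constants below are illustrative assumptions, not the paper's experimental settings.

```python
import numpy as np

def rbf_kernel(X1, X2, sigma=1.0):
    # Gaussian kernel matrix between rows of X1 and rows of X2.
    d2 = ((X1[:, None, :] - X2[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * sigma ** 2))

def laprls_fit(X, y, labeled, gamma_A=1e-2, gamma_I=1e-1, sigma=1.0, knn=5):
    """LapRLS sketch. X: all (labeled + unlabeled) points; y: targets,
    arbitrary on unlabeled rows; labeled: boolean mask of labeled rows."""
    n = X.shape[0]
    K = rbf_kernel(X, X, sigma)
    # Unnormalized graph Laplacian L = D - W from a symmetrized k-NN graph.
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    W = np.zeros((n, n))
    for i in range(n):
        nbrs = np.argsort(d2[i])[1:knn + 1]   # skip self at index 0
        W[i, nbrs] = 1.0
    W = np.maximum(W, W.T)
    L = np.diag(W.sum(1)) - W
    l = labeled.sum()
    J = np.diag(labeled.astype(float))        # selects the labeled rows
    Y = np.where(labeled, y, 0.0)
    # alpha = (J K + gamma_A * l * I + gamma_I * l / (l+u)^2 * L K)^{-1} Y
    A = J @ K + gamma_A * l * np.eye(n) + (gamma_I * l / n ** 2) * (L @ K)
    alpha = np.linalg.solve(A, Y)
    return alpha, (X, sigma)

def laprls_predict(alpha, model, Xnew):
    # Natural out-of-sample extension: evaluate the kernel expansion anywhere.
    X, sigma = model
    return rbf_kernel(Xnew, X, sigma) @ alpha
```

Setting gamma_I = 0 recovers ordinary kernel Regularized Least Squares on the labeled points alone; the Laplacian term is what lets the unlabeled points shape the solution.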

Keyphrases

manifold regularization, geometric framework, unlabeled example, general-purpose learner, semi-supervised framework, special case, new form, theoretical basis, regularized least square, general framework, graph-based approach, unlabeled data, present experimental evidence, semi-supervised setting, semi-supervised algorithm, brief discussion, transductive graph, utilize property, standard method, marginal distribution, natural out-of-sample extension, support vector machine, kernel hilbert space, new representer
