Analyzing the Effectiveness and Applicability of Co-training (2000)

by Kamal Nigam, Rayid Ghani
Citations: 263 (7 self)

BibTeX

@INPROCEEDINGS{Nigam00analyzingthe,
    author = {Kamal Nigam and Rayid Ghani},
    title = {Analyzing the Effectiveness and Applicability of Co-training},
    booktitle = {},
    year = {2000},
    pages = {86--93}
}


Abstract

Recently there has been significant interest in supervised learning algorithms that combine labeled and unlabeled data for text learning tasks. The co-training setting [1] applies to datasets that have a natural separation of their features into two disjoint sets. We demonstrate that when learning from labeled and unlabeled data, algorithms explicitly leveraging a natural independent split of the features outperform algorithms that do not. When a natural split does not exist, co-training algorithms that manufacture a feature split may outperform algorithms not using a split. These results help explain why co-training algorithms are both discriminative in nature and robust to the assumptions of their embedded classifiers.

Categories and Subject Descriptors: I.2.6 [Artificial Intelligence]: Learning; H.3.3 [Information Storage and Retrieval]: Information Search and Retrieval---Information Filtering

Keywords: co-training, expectation-maximization, learning with labeled and unlabeled...
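The co-training procedure the abstract refers to can be sketched roughly as follows: train one classifier per feature view on the labeled pool, let each view self-label its most confident unlabeled example, and grow the shared labeled pool. The sketch below is a minimal illustration on synthetic two-view Gaussian data; the nearest-centroid learner and every parameter here are assumptions made for illustration, not the paper's actual experimental setup.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic two-view data: each example has two feature "views", both
# predictive of the class -- a stand-in for the natural feature split
# described in the abstract. Purely illustrative.
n = 40
V1 = np.vstack([rng.normal(1.0, 0.5, (n, 2)), rng.normal(-1.0, 0.5, (n, 2))])
V2 = np.vstack([rng.normal(1.0, 0.5, (n, 2)), rng.normal(-1.0, 0.5, (n, 2))])
y = np.array([1] * n + [0] * n)

class Centroid:
    """Nearest-centroid learner; a simple stand-in for the per-view classifiers."""
    def fit(self, X, labels):
        self.c = {c: X[labels == c].mean(axis=0) for c in (0, 1)}
        return self
    def margin(self, X):
        # Positive margin -> class 1, negative -> class 0;
        # the magnitude serves as a confidence score.
        d0 = np.linalg.norm(X - self.c[0], axis=1)
        d1 = np.linalg.norm(X - self.c[1], axis=1)
        return d0 - d1

labeled = [0, 1, n, n + 1]                     # two labeled examples per class
unlabeled = [i for i in range(2 * n) if i not in labeled]
pseudo = np.full(2 * n, -1)                    # -1 = no label assigned yet
for i in labeled:
    pseudo[i] = y[i]

# Co-training loop: each view's classifier labels its single most
# confident unlabeled example and adds it to the shared labeled pool.
for _ in range(10):
    for view in (V1, V2):
        if not unlabeled:
            break
        clf = Centroid().fit(view[labeled], pseudo[labeled])
        m = clf.margin(view[unlabeled])
        pick = int(np.argmax(np.abs(m)))       # most confident example
        idx = unlabeled.pop(pick)
        pseudo[idx] = 1 if m[pick] > 0 else 0
        labeled.append(idx)

# Final classifier on view 1, trained on the grown labeled pool.
final = Centroid().fit(V1[labeled], pseudo[labeled])
pred = (final.margin(V1) > 0).astype(int)
accuracy = (pred == y).mean()
print(f"accuracy: {accuracy:.2f}")
```

On this well-separated synthetic data the pool grows from 4 to 24 labeled examples and the final classifier recovers nearly all labels; the point of the sketch is only the mechanism, namely that each view's confident self-labels become training data for the other view.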

Keyphrases

co-training algorithm, co-training setting, information search, significant interest, information storage, unlabeled data, natural split, text learning task, subject descriptor, feature split, embedded classifier, natural independent split, supervised learning algorithm, natural separation, disjoint set, artificial intelligence
