Instance-based learning algorithms (1991)

by David W. Aha, Dennis Kibler, Marc K. Albert
Venue: Machine Learning
Citations: 1389 (18 self)

BibTeX

@ARTICLE{Aha91instance-basedlearning,
    author = {David W. Aha and Dennis Kibler and Marc K. Albert},
    title = {Instance-based learning algorithms},
    journal = {Machine Learning},
    year = {1991},
    volume = {6},
    number = {1},
    pages = {37--66}
}


Abstract

Storing and using specific instances improves the performance of several supervised learning algorithms. These include algorithms that learn decision trees, classification rules, and distributed networks. However, no investigation has analyzed algorithms that use only specific instances to solve incremental learning tasks. In this paper, we describe a framework and methodology, called instance-based learning, that generates classification predictions using only specific instances. Instance-based learning algorithms do not maintain a set of abstractions derived from specific instances. This approach extends the nearest neighbor algorithm, which has large storage requirements. We describe how storage requirements can be significantly reduced with, at most, minor sacrifices in learning rate and classification accuracy. While the storage-reducing algorithm performs well on several real-world databases, its performance degrades rapidly with the level of attribute noise in training instances. Therefore, we extended it with a significance test to distinguish noisy instances. This extended algorithm's performance degrades gracefully with increasing noise levels and compares favorably with a noise-tolerant decision tree algorithm.
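The storage-reduction idea the abstract describes can be illustrated with a minimal sketch: a 1-nearest-neighbor classifier that stores a training instance only when the instances kept so far misclassify it, so redundant "easy" instances are discarded. This is an illustrative simplification of the paper's approach, assuming numeric attributes and Euclidean distance; the function names are hypothetical.

```python
import math

def distance(a, b):
    # Euclidean distance over numeric attribute vectors (an assumed metric)
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def classify(stored, x):
    # Predict the class of the single nearest stored instance (1-NN)
    _, label = min(stored, key=lambda inst: distance(inst[0], x))
    return label

def train_storage_reducing(examples):
    # Keep an instance only when the current store misclassifies it,
    # so instances far from class boundaries tend to be discarded
    stored = [examples[0]]
    for x, y in examples[1:]:
        if classify(stored, x) != y:
            stored.append((x, y))
    return stored

# Two well-separated one-dimensional clusters: most instances are discarded
examples = [((0.0,), 'a'), ((0.1,), 'a'), ((5.0,), 'b'),
            ((5.1,), 'b'), ((0.2,), 'a')]
stored = train_storage_reducing(examples)
```

Note that this sketch, like the basic storage-reducing algorithm, keeps every misclassified instance, which is exactly why attribute noise hurts it: noisy instances are misclassified and therefore retained. The significance test mentioned in the abstract addresses this by keeping only instances with acceptable classification records.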

Keyphrases

instance-based learning algorithm, specific instance, performance degrades, incremental learning task, noise-tolerant decision tree algorithm, instance-based learning, decision tree, storage-reducing algorithm, classification rule, classification prediction, noise level, training instance, real-world database, minor sacrifice, nearest neighbor algorithm, significance test, extended algorithm, large storage requirement, noisy instance, storage requirement, classification accuracy, attribute noise
