Logistic Regression, AdaBoost and Bregman Distances (2000)

by Michael Collins, Robert E. Schapire, Yoram Singer
Citations: 258 (44 self)

BibTeX

@MISC{Collins00logisticregression,
    author = {Michael Collins and Robert E. Schapire and Yoram Singer},
    title = {Logistic Regression, AdaBoost and Bregman Distances},
    year = {2000}
}


Abstract

We give a unified account of boosting and logistic regression in which each learning problem is cast in terms of optimization of Bregman distances. The striking similarity of the two problems in this framework allows us to design and analyze algorithms for both simultaneously, and to easily adapt algorithms designed for one problem to the other. For both problems, we give new algorithms and explain their potential advantages over existing methods. These algorithms can be divided into two types based on whether the parameters are iteratively updated sequentially (one at a time) or in parallel (all at once). We also describe a parameterized family of algorithms which interpolates smoothly between these two extremes. For all of the algorithms, we give convergence proofs using a general formalization of the auxiliary-function proof technique. As one of our sequential-update algorithms is equivalent to AdaBoost, this provides the first general proof of convergence for AdaBoost. We show that all of our algorithms generalize easily to the multiclass case, and we contrast the new algorithms with iterative scaling. We conclude with a few experimental results with synthetic data that highlight the behavior of the old and newly proposed algorithms in different settings.
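The abstract does not include pseudocode. As a rough illustration of the sequential-update idea it describes (one parameter adjusted per round, with AdaBoost arising from coordinate-wise minimization of the exponential loss), here is a minimal NumPy sketch. The weak-hypothesis matrix, the confidence-rated step formula, and the fixed round count are assumptions for illustration, not the paper's exact algorithm.

```python
import numpy as np

def adaboost_sequential(h, y, n_rounds=100):
    """Sequential-update sketch: coordinate descent on the exponential loss
    sum_i exp(-y_i * (h @ lam)_i), the loss AdaBoost is known to minimize.

    h : (m, n) array, h[i, j] = output of weak hypothesis j on example i, in [-1, +1]
    y : (m,) array of labels in {-1, +1}
    Returns the coefficient vector lam (one weight per weak hypothesis).
    """
    m, n = h.shape
    lam = np.zeros(n)
    margins = np.zeros(m)              # y_i * (h @ lam)_i, all zero initially
    eps = 1e-12

    for _ in range(n_rounds):
        w = np.exp(-margins)           # unnormalized example weights
        w /= w.sum()
        edges = (w * y) @ h            # r_j = sum_i w_i y_i h[i, j]
        j = int(np.argmax(np.abs(edges)))          # best single coordinate this round
        r = np.clip(edges[j], -1 + eps, 1 - eps)
        alpha = 0.5 * np.log((1 + r) / (1 - r))    # AdaBoost-style step size
        lam[j] += alpha
        margins += alpha * y * h[:, j]             # incremental margin update

    return lam

if __name__ == "__main__":
    # Toy usage: sign-of-feature "stumps" as the weak hypotheses (illustrative only).
    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 5))
    y = np.sign(X[:, 0] + 0.5 * X[:, 1] + 0.1 * rng.normal(size=200))
    h = np.sign(X)
    lam = adaboost_sequential(h, y)
    print("training accuracy:", np.mean(np.sign(h @ lam) == y))
```

A parallel-update variant, in the spirit the abstract contrasts with this one, would instead adjust every coordinate of lam on each round using the same weight vector.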

Keyphrases

bregman distance, logistic regression, new algorithm, convergence proof, synthetic data, general formalization, unified account, parameterized family, auxiliary-function proof technique, different setting, first general proof, potential advantage, learning problem, sequential-update algorithm, experimental result, striking similarity, multiclass case, iterative scaling
