Least angle regression (2004)

by Bradley Efron , Trevor Hastie , Iain Johnstone , Robert Tibshirani
Citations: 1320 (33 self)

BibTeX

@ARTICLE{Efron04leastangle,
    author = {Bradley Efron and Trevor Hastie and Iain Johnstone and Robert Tibshirani},
    title = {Least angle regression},
    journal = {The Annals of Statistics},
    volume = {32},
    number = {2},
    pages = {407--499},
    year = {2004}
}


Abstract

The purpose of model selection algorithms such as All Subsets, Forward Selection and Backward Elimination is to choose a linear model on the basis of the same set of data to which the model will be applied. Typically we have available a large collection of possible covariates from which we hope to select a parsimonious set for the efficient prediction of a response variable. Least Angle Regression (LARS), a new model selection algorithm, is a useful and less greedy version of traditional forward selection methods. Three main properties are derived: (1) A simple modification of the LARS algorithm implements the Lasso, an attractive version of ordinary least squares that constrains the sum of the absolute regression coefficients; the LARS modification calculates all possible Lasso estimates for a given problem, using an order of magnitude less computer time than previous methods. (2) A different LARS modification efficiently implements Forward Stagewise linear regression, another promising new model selection method; this connection explains the similar numerical results previously observed for the Lasso and Stagewise, and helps us understand the properties of both methods, which are seen as constrained versions of the simpler LARS algorithm. (3) A simple approximation for the degrees of freedom of a LARS estimate is available, from which we derive a Cp estimate of prediction error; this allows a principled choice among the range of possible LARS estimates. LARS and its variants are computationally efficient: the paper describes a publicly available algorithm that requires only the same order of magnitude of computational effort as ordinary least squares applied to the full set of covariates.
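The abstract presents Forward Stagewise linear regression as a close relative of LARS: at each step it takes a small move in the direction of the covariate most correlated with the current residual. A minimal sketch of that idea in Python (the step size, iteration count, and synthetic data below are illustrative assumptions, not values from the paper):

```python
import numpy as np

def forward_stagewise(X, y, step=0.01, n_steps=2000):
    """Incremental forward stagewise regression: repeatedly nudge the
    coefficient of the covariate most correlated with the residual."""
    beta = np.zeros(X.shape[1])
    for _ in range(n_steps):
        corr = X.T @ (y - X @ beta)         # correlation of each column with the residual
        j = np.argmax(np.abs(corr))         # most correlated covariate
        beta[j] += step * np.sign(corr[j])  # take a small step in its direction
    return beta

# Illustrative synthetic data: columns standardized to mean 0 and unit norm,
# response driven by the first two covariates plus a little noise.
rng = np.random.default_rng(0)
X = rng.standard_normal((100, 5))
X = (X - X.mean(axis=0)) / np.linalg.norm(X - X.mean(axis=0), axis=0)
y = 2.0 * X[:, 0] - 1.5 * X[:, 1] + 0.01 * rng.standard_normal(100)

beta = forward_stagewise(X, y)  # coefficients approach (2, -1.5, 0, 0, 0)
```

Because each move is tiny, the coefficient path traces out the cautious behavior the paper contrasts with greedy forward selection; LARS itself replaces the many small steps with exact piecewise-linear moves along equiangular directions.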

Keyphrases

least angle regression, mathematical statistic, forward selection, similar numerical result, efficient prediction, available algorithm, principled choice, lars estimate, greedy version, different lars modification, simple approximation, attractive version, possible covariates, new model selection algorithm, computational effort, lars modification, full set, parsimonious set, cp estimate, computer time, linear model, lars algorithm, simple modification, absolute regression coefficient, constrained version, new model selection method, possible lars estimate, possible lasso estimate, previous method, traditional forward selection method, large collection, backward elimination, simpler lars algorithm, model selection algorithm, prediction error, forward stagewise linear regression, response variable, main property


Developed at and hosted by The College of Information Sciences and Technology

© 2007-2019 The Pennsylvania State University