A greedy framework for first-order optimization

by J. Steinhardt, J. Huggins
Results 1 - 1 of 1

Hybrid conditional gradient-smoothing algorithms with applications to sparse and low rank regularization

by A. Argyriou, M. Signoretto, J. Suykens - Regularization, Optimization, Kernels, and Support Vector Machines, 2014
"... Conditional gradient methods are old and well studied optimization algorithms. Their origin dates at least to the 50’s and the Frank-Wolfe algorithm for quadratic programming [18] but they apply to much more general optimization problems. General formulations of conditional gradient algorithms have ..."
Abstract - Cited by 1 (0 self) - Add to MetaCart
Conditional gradient methods are old and well studied optimization algorithms. Their origin dates at least to the 50’s and the Frank-Wolfe algorithm for quadratic programming [18] but they apply to much more general optimization problems. General formulations of conditional gradient algorithms have been studied in the

Citation Context

"…example, it has been observed that conditional gradient methods are related to boosting, greedy methods for sparse problems [10, 51], and to orthogonal matching pursuit [28, 27]. Some very recent papers [3, 53] show an equivalence to the optimization method of mirror descent, which we discuss briefly in Section 3.2. One reason for the popularity and the revival of interest in conditional gradient methods ha…"
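
For reference, the conditional gradient (Frank-Wolfe) step that this literature revolves around can be sketched in a few lines of Python. This is a minimal illustration, not code from either paper: the probability-simplex domain, the least-squares objective, and the 2/(t+2) step size are assumptions chosen so that the linear subproblem has a closed form.

    import numpy as np

    def frank_wolfe_simplex(grad, x0, n_iters=100):
        # Conditional gradient over the probability simplex: the linear
        # minimization oracle reduces to picking the coordinate of the
        # gradient with the smallest value (a vertex of the simplex).
        x = x0.copy()
        for t in range(n_iters):
            g = grad(x)
            s = np.zeros_like(x)
            s[np.argmin(g)] = 1.0       # argmin over the simplex of <g, s>
            gamma = 2.0 / (t + 2.0)     # classical step-size schedule
            x = (1.0 - gamma) * x + gamma * s
        return x

    # Illustrative use: least squares restricted to the simplex.
    rng = np.random.default_rng(0)
    A = rng.normal(size=(20, 5))
    b = rng.normal(size=20)
    x = frank_wolfe_simplex(lambda x: 2.0 * A.T @ (A @ x - b), np.ones(5) / 5)

Each iteration moves toward a single extreme point of the feasible set, which is the greedy structure that the citation context above links to boosting and matching pursuit.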
