An inexact proximal path-following algorithm for constrained convex minimization

by Q Tran-Dinh, A Kyrillidis, V Cevher

Results 1 - 3 of 3

Composite Self-Concordant Minimization

by Quoc Tran-Dinh, Anastasios Kyrillidis, Volkan Cevher
"... We propose a variable metric framework for minimizing the sum of a self-concordant function and a possibly non-smooth convex function endowed with a computable proximal operator. We theoretically establish the convergence of our framework without relying on the usual Lipschitz gradient assumption on ..."
Abstract - Cited by 7 (5 self) - Add to MetaCart
We propose a variable metric framework for minimizing the sum of a self-concordant function and a possibly non-smooth convex function endowed with a computable proximal operator. We theoretically establish the convergence of our framework without relying on the usual Lipschitz gradient assumption on the smooth part. An important highlight of our work is a new set of analytic step-size selection and correction procedures based on the structure of the problem. We describe concrete algorithmic instances of our framework for several interesting large-scale applications and demonstrate them numerically on both synthetic and real data.
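As a rough illustration of the kind of composite objective described above (a self-concordant smooth term plus a non-smooth term with a cheap proximal operator), the sketch below performs one damped proximal gradient step. It is not the paper's variable metric framework: the log-barrier objective, the diagonal Hessian surrogate, and the feasibility backtracking are illustrative assumptions made here.

```python
import numpy as np

def soft_threshold(v, t):
    """Proximal operator of t * ||.||_1 (elementwise soft-thresholding)."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def prox_grad_step(A, b, x, lam):
    """One damped proximal gradient step on f(x) + g(x), where
    f(x) = -sum(log(b - A x)) is a standard self-concordant barrier and
    g(x) = lam * ||x||_1 has the soft-thresholding prox.
    Assumes x is strictly feasible, i.e. b - A x > 0 componentwise."""
    r = b - A @ x                        # barrier residual, must stay positive
    grad = A.T @ (1.0 / r)               # gradient of the log-barrier
    h = (A ** 2).T @ (1.0 / r ** 2)      # diagonal surrogate of the Hessian
    step = 1.0 / h                       # per-coordinate step from the local metric
    x_half = x - step * grad             # forward (gradient) step
    x_new = soft_threshold(x_half, lam * step)  # backward (proximal) step
    d = x_new - x
    t = 1.0                              # damping: backtrack until the iterate
    while np.any(b - A @ (x + t * d) <= 0):  # stays inside the barrier's domain
        t *= 0.5
    return x + t * d
```

The backtracking at the end plays the role that analytic damping plays for self-concordant objectives: it keeps the iterate inside the domain of the smooth part without assuming a globally Lipschitz continuous gradient.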

Scalable sparse covariance estimation via self-concordance

by Anastasios Kyrillidis, Rabeeh Karimi Mahabadi, Quoc Tran-Dinh, Volkan Cevher
"... We consider the class of convex minimization prob-lems, composed of a self-concordant function, such as the log det metric, a convex data fidelity term h(·) and, a regularizing – possibly non-smooth – function g(·). This type of problems have recently attracted a great deal of interest, mainly due t ..."
Abstract - Cited by 1 (1 self) - Add to MetaCart
We consider the class of convex minimization problems composed of a self-concordant function, such as the log det metric, a convex data fidelity term h(·), and a regularizing, possibly non-smooth, function g(·). This type of problem has recently attracted a great deal of interest, mainly due to its omnipresence in top-notch applications. Under this locally Lipschitz continuous gradient setting, we analyze the convergence behavior of proximal Newton schemes with the added twist of possibly inexact evaluations. We prove attractive convergence rate guarantees and enhance state-of-the-art optimization schemes to accommodate such developments. Experimental results on sparse covariance estimation show the merits of our algorithm, both in terms of recovery efficiency and complexity.
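The abstract describes minimizing a composite objective built from the log det function, a data fidelity term, and a possibly non-smooth regularizer. As a rough orientation only, the sketch below takes one proximal gradient step on a graphical-lasso-type instance of that objective; it is not the paper's inexact proximal Newton scheme, and the backtracking rule and the unpenalized diagonal are simplifying assumptions made here.

```python
import numpy as np

def soft_threshold_offdiag(M, t):
    """Soft-threshold the off-diagonal entries of a symmetric matrix
    (the prox of an l1 penalty applied to the off-diagonal entries)."""
    out = np.sign(M) * np.maximum(np.abs(M) - t, 0.0)
    np.fill_diagonal(out, np.diag(M))    # leave the diagonal unpenalized
    return out

def prox_grad_step(S, Theta, lam, alpha=1.0):
    """One backtracked proximal gradient step on the graphical-lasso-type
    objective  -log det(Theta) + trace(S Theta) + lam * ||Theta||_1,offdiag,
    where S is the sample covariance and Theta the precision estimate."""
    grad = S - np.linalg.inv(Theta)      # gradient of the smooth log det part
    while True:
        cand = soft_threshold_offdiag(Theta - alpha * grad, alpha * lam)
        cand = 0.5 * (cand + cand.T)     # keep the iterate symmetric
        if np.all(np.linalg.eigvalsh(cand) > 0):  # stay in the log det domain
            return cand
        alpha *= 0.5                     # shrink the step if not positive definite
```

Here the matrix inverse supplies the exact gradient; the inexact evaluations studied in the paper would, roughly speaking, correspond to replacing such exact linear algebra or the inner subproblem solve with an approximate one.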

An optimal first-order primal-dual gap reduction framework for constrained convex optimization

by Quoc Tran-Dinh, Volkan Cevher
"... ..."
Abstract - Add to MetaCart
Abstract not found