Smoothing and worst-case complexity for direct-search methods in nonsmooth optimization

by R. Garmanjani, L. N. Vicente
Venue: IMA J. Numer. Anal.
Citing documents: Results 1 - 8 of 8

Worst case complexity of direct search

by L. N. Vicente, 2010
"... In this paper we prove that direct search of directional type shares the worst case complexity bound of steepest descent when sufficient decrease is imposed using a quadratic function of the step size parameter. This result is proved under smoothness of the objective function and using a framework o ..."
Abstract - Cited by 33 (4 self) - Add to MetaCart
In this paper we prove that direct search of directional type shares the worst case complexity bound of steepest descent when sufficient decrease is imposed using a quadratic function of the step size parameter. This result is proved under smoothness of the objective function and using a framework of the type of GSS (generating set search). We also discuss the worst case complexity of direct search when only simple decrease is imposed and when the objective function is nonsmooth.
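The sufficient-decrease mechanism described in this abstract can be illustrated with a minimal sketch (not the paper's algorithm; the function name, the forcing constant c, and the step-size update rules are illustrative choices): a trial point along a coordinate direction is accepted only if it decreases f by at least c times the square of the step size parameter.

    import numpy as np

    def direct_search(f, x0, alpha0=1.0, c=1e-4, alpha_min=1e-8, max_iter=1000):
        """Sketch of directional direct search with sufficient decrease.

        A trial point x + alpha*d (d ranging over the +/- coordinate
        directions) is accepted only if f(x + alpha*d) <= f(x) - c*alpha**2;
        otherwise the step size alpha is halved. The constants and update
        rules are illustrative, not those of the cited paper.
        """
        x = np.asarray(x0, dtype=float)
        n = x.size
        D = np.vstack([np.eye(n), -np.eye(n)])   # positive spanning set
        alpha, fx = alpha0, f(x)
        for _ in range(max_iter):
            if alpha < alpha_min:
                break
            improved = False
            for d in D:
                trial = x + alpha * d
                ft = f(trial)
                if ft <= fx - c * alpha**2:       # sufficient decrease (quadratic forcing)
                    x, fx, improved = trial, ft, True
                    break
            alpha = 2.0 * alpha if improved else 0.5 * alpha
        return x, fx

For instance, direct_search(lambda z: np.sum(z**2), np.array([1.0, -2.0])) drives the iterates toward the origin without using any derivative information.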

Citation Context

...ed to smooth functions. Deviation from smoothness (see [2, 18]) poses several difficulties to the derivation of a worst case complexity bound, and such a study will be the subject of a separate paper [9]. It should be pointed out that the results of this paper can be extended to bound and linear constraints, where the number of positive generators of the tangent cones of the nearly active constraints...

Stochastic First- and Zeroth-Order Methods for Nonconvex Stochastic Programming

by Saeed Ghadimi, Guanghui Lan, 2013
"... ..."
Abstract - Cited by 14 (3 self) - Add to MetaCart
Abstract not found

Penalty Methods with Stochastic Approximation for Stochastic Nonlinear Programming

by Xiao Wang, Shiqian Ma, Ya-xiang Yuan, 2013
"... ..."
Abstract - Cited by 2 (0 self) - Add to MetaCart
Abstract not found

Citation Context

...and explored their function-evaluation worst-case complexity. Both methods need at most O(ε^{-2}) function evaluations to reduce a first-order criticality measure below ε. Garmanjani and Vicente [13] proposed a smoothing direct-search method for nonsmooth, nonconvex, but Lipschitz continuous unconstrained optimization. They showed that the method takes at most O(ε^{-3} log ε^{-1}) function evaluations...

A Derivative-Free CoMirror Algorithm for Convex Optimization

by Heinz H. Bauschke, et al., 2014
"... We consider the minimization of a nonsmooth convex function over a compact convex set subject to a nonsmooth convex constraint. We work in the setting of derivative-free optimization (DFO), assuming that the objective and constraint functions are available through a black-box that provides function ..."
Abstract - Add to MetaCart
We consider the minimization of a nonsmooth convex function over a compact convex set subject to a nonsmooth convex constraint. We work in the setting of derivative-free optimization (DFO), assuming that the objective and constraint functions are available through a black box that provides function values for lower-C2 representations of the functions. Our approach is based on a DFO adaptation of the CoMirror algorithm [6]. Algorithmic convergence hinges on the ability to accurately approximate subgradients of lower-C2 functions, which we prove is possible through linear interpolation. We show that, if the sampling radii for linear interpolation are properly selected, then the new algorithm has the same convergence rate as the original gradient-based algorithm. This provides a novel global rate-of-convergence result for nonsmooth convex DFO with nonsmooth convex constraints. We conclude with numerical testing that demonstrates the practical feasibility of the algorithm and some directions for further research.
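A hedged sketch of what "approximating a (sub)gradient through linear interpolation" can look like in practice (the function name simplex_gradient and the default sampling radius r are illustrative assumptions, not the authors' code): fit a linear model through the function values at x and x + r·e_i, which for this particular sample set reduces to forward differences; the cited paper analyses how the radii must be chosen for lower-C2 functions.

    import numpy as np

    def simplex_gradient(f, x, r=1e-5):
        """Approximate a (sub)gradient of f at x by linear interpolation.

        Fits the linear model m(y) = f(x) + g.(y - x) through the sample
        points x + r*e_i, i = 1..n, which for this sample set reduces to
        forward differences. The sampling radius r is an illustrative
        default, not the paper's prescribed choice.
        """
        x = np.asarray(x, dtype=float)
        n = x.size
        fx = f(x)
        g = np.empty(n)
        for i in range(n):
            e = np.zeros(n)
            e[i] = 1.0
            g[i] = (f(x + r * e) - fx) / r   # interpolation through x and x + r*e_i
        return g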

Worst Case Complexity of Direct Search under Convexity

by unknown authors, 2013
"... In this paper we prove that the broad class of direct-search methods of directional type, based on imposing sufficient decrease to accept new iterates, exhibits the same global rate or worst case complexity bound of the gradient method for the unconstrained minimization of a convex and smooth functi ..."
Abstract - Add to MetaCart
In this paper we prove that the broad class of direct-search methods of directional type, based on imposing sufficient decrease to accept new iterates, exhibits the same global rate or worst case complexity bound as the gradient method for the unconstrained minimization of a convex and smooth function. More precisely, it will be shown that the number of iterations needed to reduce the norm of the gradient of the objective function below a certain threshold is at most proportional to the inverse of the threshold. Our result is slightly less general than Nesterov’s for the gradient method, in the sense that we require more than just convexity of the objective function and boundedness of the distance from the initial iterate to the solution set. Our additional condition can, however, be satisfied in several scenarios, such as strong or uniform convexity, boundedness of the initial level set, or boundedness of the distance from the initial contour set to the solution set. It is a mild price to pay for deriving such a global rate for zero-order methods.
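Stated compactly (notation mine, under the assumptions listed in the abstract), the claimed rate is that direct search with sufficient decrease drives the gradient norm below ε within a number of iterations proportional to 1/ε:

    \min_{0 \le k \le N} \|\nabla f(x_k)\| \le \epsilon
    \qquad \text{whenever} \qquad
    N \ge C\,\epsilon^{-1},

where C is a constant depending on the problem and on the algorithm parameters.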

Citation Context

... O(n^2 ε^{-3/2}) for their adaptive cubic overestimation algorithm when using finite differences to approximate derivatives. In the non-smooth case, using smoothing techniques, both Garmanjani and Vicente [7] and Nesterov [11] established a global rate of approximately O(ε^{-3}) iterations (and O(n^3 ε^{-3}) function evaluations) for their zero-order methods, where the threshold ε refers now to the gra...

Pré-Publicações do Departamento de Matemática, Universidade de Coimbra, Preprint Number 15-27: A Second-Order Globally Convergent Direct-Search Method and its Worst-Case Complexity

by S. Gratton, C. W. Royer, L. N. Vicente
"... Abstract: Direct-search algorithms form one of the main classes of algorithms for smooth unconstrained derivative-free optimization, due to their simplicity and their well-established convergence results. They proceed by iteratively looking for improvement along some vectors or directions. In the pr ..."
Abstract - Add to MetaCart
Direct-search algorithms form one of the main classes of algorithms for smooth unconstrained derivative-free optimization, due to their simplicity and their well-established convergence results. They proceed by iteratively looking for improvement along some vectors or directions. In the presence of smoothness, first-order global convergence comes from the ability of the vectors to approximate the steepest descent direction, which can be quantified by a first-order criticality (cosine) measure. The use of a set of vectors with a positive cosine measure, together with the imposition of a sufficient decrease condition to accept new iterates, leads to a convergence result as well as a worst-case complexity bound. In this paper, we present a second-order study of a general class of direct-search methods. We start by proving a weak second-order convergence result related to a criticality measure defined along the directions used throughout the iterations. Extensions of this result to obtain a true second-order optimality one are discussed, one possibility being a method using approximate Hessian eigenvectors as directions (which is proved to be truly second-order globally convergent). Numerically guaranteeing such convergence can be rather expensive, as indicated by the worst-case complexity analysis provided in this paper, but turns out to be appropriate for some pathological examples.
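One way to picture "approximate Hessian eigenvectors as directions" is the sketch below (a minimal illustration, not the paper's method; the finite-difference Hessian estimate, the function names, and the step size h are my own assumptions): an eigenvector associated with the smallest eigenvalue of an approximate Hessian, together with its negative, is appended to the usual coordinate poll set so that negative curvature can be exploited.

    import numpy as np

    def fd_hessian(f, x, h=1e-4):
        """Central finite-difference Hessian estimate (illustrative; O(n^2) evaluations)."""
        n = x.size
        H = np.empty((n, n))
        for i in range(n):
            for j in range(n):
                ei, ej = np.zeros(n), np.zeros(n)
                ei[i], ej[j] = h, h
                H[i, j] = (f(x + ei + ej) - f(x + ei - ej)
                           - f(x - ei + ej) + f(x - ei - ej)) / (4 * h**2)
        return 0.5 * (H + H.T)   # symmetrize

    def poll_directions(f, x):
        """Coordinate directions plus +/- the eigenvector of the smallest
        eigenvalue of the approximate Hessian (a curvature/escape direction)."""
        n = x.size
        D = np.vstack([np.eye(n), -np.eye(n)])
        w, V = np.linalg.eigh(fd_hessian(f, x))
        v_min = V[:, 0]                      # eigenvalues ascending, so column 0
        return np.vstack([D, v_min, -v_min])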

Worst Case Complexity of Direct Search under Convexity

by unknown authors, 2014
"... In this paper we prove that the broad class of direct-search methods of directional type, based on imposing sufficient decrease to accept new iterates, exhibits the same worst case complexity bound and global rate of the gradient method for the unconstrained minimization of a convex and smooth funct ..."
Abstract - Add to MetaCart
In this paper we prove that the broad class of direct-search methods of directional type, based on imposing sufficient decrease to accept new iterates, exhibits the same worst case complexity bound and global rate as the gradient method for the unconstrained minimization of a convex and smooth function. More precisely, it will be shown that the number of iterations needed to reduce the norm of the gradient of the objective function below a certain threshold is at most proportional to the inverse of the threshold. It will also be shown that the absolute error in the function values decays at a sublinear rate proportional to the inverse of the iteration counter. In addition, we prove that the sequences of absolute errors in function values and iterates converge r-linearly in the strongly convex case.

Citation Context

... O(n^2 ε^{-3/2}) for their adaptive cubic overestimation algorithm when using finite differences to approximate derivatives. In the non-smooth case, using smoothing techniques, both Garmanjani and Vicente [6] and Nesterov [12] established a WCC bound of approximately O(ε^{-3}) iterations for their zero-order methods, where the threshold ε refers now to the gradient of a smoothed version of the original funct...

A second-order globally convergent direct-search method and its worst-case complexity

by unknown authors, 2015
"... second-order globally convergent direct-search method and its ..."
Abstract - Add to MetaCart
second-order globally convergent direct-search method and its