Results 1 - 4 of 4
DIRECT SEARCH METHODS OVER LIPSCHITZ MANIFOLDS, 2007
Abstract
Cited by 1 (1 self)
We extend direct search methods to optimization problems that include equality constraints given by Lipschitz functions. The equality constraints are assumed to implicitly define a Lipschitz manifold. Numerically implementing the inverse (implicit) function theorem allows us to define a new problem on the tangent spaces of the manifold. We can then use a direct search method on the tangent spaces to solve the new optimization problem without any equality constraints. Solving this related problem implicitly solves the original optimization problem. Our main example utilizes the LTMADS algorithm for the direct search method. However, other direct search methods can be employed. Convergence results trivially carry over to our new procedure under mild assumptions.
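As background for the abstract above, the following is a minimal sketch of a generic direct search (compass search) on a black-box objective. It is not LTMADS itself, which uses randomized mesh-adaptive polling directions, but it illustrates the poll-and-shrink logic that direct search methods share; all names are illustrative.

```python
import numpy as np

def compass_search(f, x0, step=1.0, tol=1e-6, max_iter=10000):
    """Minimize a black-box f by polling the 2n coordinate directions.

    A stand-in for the direct search step: LTMADS uses richer
    randomized polling directions, but the poll/shrink logic is the same.
    """
    x = np.asarray(x0, dtype=float)
    n = x.size
    fx = f(x)
    for _ in range(max_iter):
        improved = False
        for d in np.vstack([np.eye(n), -np.eye(n)]):  # poll set: +/- e_i
            y = x + step * d
            fy = f(y)
            if fy < fx:          # successful poll: accept and keep step
                x, fx, improved = y, fy, True
                break
        if not improved:
            step *= 0.5          # unsuccessful poll: refine the mesh
            if step < tol:
                break
    return x, fx
```

In the manifold setting of the paper, this polling would be carried out on the tangent spaces produced by the numerical implicit function theorem rather than directly in the ambient space.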
A Derivative-Free CoMirror Algorithm for Convex Optimization, 2014
Abstract
We consider the minimization of a nonsmooth convex function over a compact convex set subject to a nonsmooth convex constraint. We work in the setting of derivative-free optimization (DFO), assuming that the objective and constraint functions are available through a black-box that provides function values for lower-C2 representation of the functions. Our approach is based on a DFO adaptation of the CoMirror algorithm [6]. Algorithmic convergence hinges on the ability to accurately approximate subgradients of lower-C2 functions, which we prove is possible through linear interpolation. We show that, if the sampling radii for linear interpolation are properly selected, then the new algorithm has the same convergence rate as the original gradient-based algorithm. This provides a novel global rate-of-convergence result for nonsmooth convex DFO with nonsmooth convex constraints. We conclude with numerical testing that demonstrates the practical feasibility of the algorithm and some directions for further research.
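The linear-interpolation idea mentioned in the abstract can be sketched as a simplex gradient: fit an affine model through n + 1 sample points and read off its slope. The sketch below uses a coordinate simplex, for which the fit reduces to forward differences; function and parameter names are hypothetical, and the paper's analysis covers the harder nonsmooth (lower-C2) case.

```python
import numpy as np

def simplex_gradient(f, x, h=1e-4):
    """Approximate a (sub)gradient of f at x by linear interpolation.

    Fits the affine model m(y) = f(x) + g.(y - x) through the n + 1
    sample points {x, x + h*e_1, ..., x + h*e_n}; for this coordinate
    simplex the interpolation reduces to forward differences.
    """
    x = np.asarray(x, dtype=float)
    fx = f(x)
    g = np.empty_like(x)
    for i in range(x.size):
        e = np.zeros_like(x)
        e[i] = h
        g[i] = (f(x + e) - fx) / h   # slope along the i-th simplex edge
    return g
```

The choice of the sampling radius h is exactly the quantity the abstract says must be "properly selected" to retain the gradient-based convergence rate.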
A SIMPLICIAL CONTINUATION DIRECT SEARCH METHOD, 2007
Abstract
A direct search method for the class of problems considered by Lewis and Torczon [SIAM J. Optim., 12 (2002), pp. 1075-1089] is developed. Instead of using an augmented Lagrangian method, a simplicial approximation method to the feasible set is implicitly employed. This allows the points our algorithm considers to conveniently remain within an a priori specified distance of the feasible set. In the limit, a positive spanning set is constructed in the tangent plane to the feasible set of every cluster point. In this way, we can guarantee that every cluster point is a stationary point for the objective function restricted to the feasible set.
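The positive spanning set in the tangent plane mentioned above can be sketched as follows: given the gradient of a single equality constraint, take an orthonormal basis B of its null space and append the negative sum of the basis vectors, giving a minimal positive basis of the tangent plane. This is a standard construction, not the paper's specific algorithm, and the function name is hypothetical.

```python
import numpy as np

def tangent_positive_basis(grad_c):
    """Minimal positive basis for the tangent plane {d : grad_c . d = 0}.

    The columns of a null-space basis B together with -sum(columns of B)
    positively span span(B): every tangent vector is a nonnegative
    combination of these n columns.
    """
    g = np.asarray(grad_c, dtype=float).reshape(1, -1)
    # orthonormal basis of the null space of g via the SVD
    _, _, vt = np.linalg.svd(g)
    B = vt[1:].T                     # n x (n-1) tangent-space basis
    return np.hstack([B, -B.sum(axis=1, keepdims=True)])
```

Polling along such a set guarantees that no descent direction within the tangent plane is missed, which is what underlies stationarity at cluster points.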
A DIRECT SEARCH METHOD FOR WORST CASE ANALYSIS AND YIELD OPTIMIZATION OF INTEGRATED CIRCUITS, 2009
Abstract
Yield maximization is an important aspect in the design of integrated circuits. A prerequisite for its automation is a reliable and fast worst performance analysis which results in corners that can be used in the process of circuit optimization. We formulate the constrained optimization problem for finding the worst performance of an integrated circuit and develop a direct search method for solving it. The algorithm uses radial steps and rotations for enforcing the inequality constraint. We demonstrate the performance of the proposed algorithm on real-world design examples of integrated circuits. The results indicate that the algorithm solves the worst performance problem in an efficient manner. The proposed algorithm was also successfully used in the process of yield maximization, resulting in a 99.65% yield.
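The radial step mentioned in the abstract can be illustrated for the simplest case, where the inequality constraint is a Euclidean ball on the statistical parameters: an infeasible iterate is pulled back along the ray from the origin. This is only a toy sketch of the idea under that ball-constraint assumption, with a hypothetical function name; the paper's method also uses rotations, which are not shown here.

```python
import numpy as np

def radial_step(x, r):
    """Pull an infeasible iterate x back onto the ball ||x|| <= r
    by scaling it along the ray from the origin (a 'radial step')."""
    nrm = np.linalg.norm(x)
    return x if nrm <= r else (r / nrm) * x
```

Feasible points are returned unchanged, so the step only activates when a poll point violates the constraint.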