Results 1 - 10 of 10
Discussion and Empirical Comparisons of Linear Relaxations and Alternate Techniques in Validated Deterministic Global Optimization
, 2004
Cited by 8 (0 self)
Validated linear relaxations and preprocessing: Some experiments
SIAM J. Optim., 2005
Cited by 5 (3 self)
Based on work originating in the early 1970s, a number of recent global optimization algorithms have relied on replacing an original nonconvex nonlinear program by convex or linear relaxations. Such linear relaxations can be generated automatically through an automatic differentiation process. This process decomposes the objective and constraints (if any) into convex and nonconvex unary and binary operations. The convex operations can be approximated arbitrarily well by appending additional constraints, while the domain must somehow be subdivided (in an overall branch-and-bound process or in some other local process) to handle nonconvex constraints. In general, a problem can be hard if even a single nonconvex term appears. However, certain nonconvex terms lead to easier-to-solve problems than others. Recently, Neumaier, Lebbah, Michel, ourselves, and others have paved the way to utilizing such techniques in a validated context. In this paper, we present a symbolic preprocessing step that provides a measure of the intrinsic difficulty of a problem. Based on this step, one of two methods can be chosen to relax nonconvex terms. This preprocessing step is similar to a method previously proposed by Epperly and Pistikopoulos [J. Global Optim., 11 (1997), pp. 287–311] for determining subspaces in which to branch, but we present it from a different point of view that is amenable to simplification of the problem presented to the linear programming solver, and within a validated context. Besides an illustrative example, we have implemented general relaxations in a validated context, as well as the preprocessing technique, and we present experiments on a standard test set. Finally, we present conclusions.
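The linear relaxations this abstract describes are assembled from known convex envelopes of the elementary operations in the decomposition. As an illustrative sketch (the generic McCormick envelope for a bilinear term, not code from the paper), the four inequalities below enclose the nonconvex term w = x*y over a box; subdividing the box in branch and bound tightens the enclosure:

```python
# Hedged sketch: McCormick's linear relaxation of the bilinear term
# w = x*y over the box [xL, xU] x [yL, yU]. The function name and
# interface are illustrative, not from the paper.

def mccormick_bounds(x, y, xL, xU, yL, yU):
    """Return (lower, upper) bounds on x*y implied by the McCormick envelope."""
    lower = max(xL * y + x * yL - xL * yL,
                xU * y + x * yU - xU * yU)
    upper = min(xU * y + x * yL - xU * yL,
                xL * y + x * yU - xL * yU)
    return lower, upper

# The envelope is valid at every point of the box:
lo, hi = mccormick_bounds(0.5, 0.5, 0.0, 1.0, 0.0, 1.0)
assert lo <= 0.5 * 0.5 <= hi
```

Because each inequality is linear in x and y, appending them to the problem yields an ordinary linear program whose optimum bounds the original nonconvex problem from below.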
Validated bounds on basis vectors for the null space of a full rank rectangular matrix
, 2005
Validated Constraint Solving – Practicalities, Pitfalls, and New Developments
Cited by 1 (0 self)
Many constraint propagation techniques iterate through the constraints in a straightforward manner, but can fail because they do not take account of the coupling between the constraints. However, some methods of taking account of this coupling are local in nature, and fail if the initial search region is too large. We put into perspective newer methods, based on linear relaxations, that can often replace brute-force search by solution of a large, sparse linear program. Robustness has been recognized as important in geometric computations and elsewhere for at least a decade, and more and more developers are including validation in the design of their systems. We provide citations to our work to date in developing validated versions of linear relaxations. This work is in the form of a brief review and prospectus for future development. We give various simple examples to illustrate our points.
Keywords: constraint propagation, global optimization, linear relaxations, GlobSol
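The coupling failure the abstract mentions can be seen on a tiny assumed example (not from the paper): hull-consistency propagation on the pair x + y = 0, x - y = 0 over the box [-1,1] x [-1,1] makes no progress at all, even though the system has the unique solution (0, 0) that a linear relaxation would recover immediately:

```python
# Hedged sketch of one-variable interval propagation on a coupled system.
# Constraints: x + y = 0 and x - y = 0 over box [-1,1] x [-1,1].

def propagate(box):
    """One sweep of hull consistency; returns the (possibly) tightened box."""
    (xl, xu), (yl, yu) = box
    # From x + y = 0: x in [-yu, -yl] and y in [-xu, -xl].
    xl, xu = max(xl, -yu), min(xu, -yl)
    yl, yu = max(yl, -xu), min(yu, -xl)
    # From x - y = 0: x in [yl, yu] and y in [xl, xu].
    xl, xu = max(xl, yl), min(xu, yu)
    yl, yu = max(yl, xl), min(yu, xu)
    return ((xl, xu), (yl, yu))

box = ((-1.0, 1.0), (-1.0, 1.0))
# Propagation stalls: the box is unchanged despite the unique solution (0, 0).
assert propagate(box) == box
```

A linear program over both constraints simultaneously sees the coupling and collapses the box to the single point, which is the motivation for the relaxation-based methods reviewed above.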
Constraint Consensus Concentration for Identifying Disjoint Feasible Regions in Nonlinear Programs
It is usually not known in advance whether a nonlinear set of constraints has zero, one, or multiple feasible regions. Further, if one or more feasible regions exist, their locations are usually unknown. We propose a method for exploring the variable space quickly using Constraint Consensus to identify promising areas that may contain a feasible region. Multiple Constraint Consensus solution points are clustered to identify regions of attraction. A new inter-point distance frequency distribution technique is used to determine the critical distance for the single linkage clustering algorithm, which in turn determines the estimated number of disjoint feasible regions. The effectiveness of multistart global optimization is increased due to better exploration of the variable space, and efficiency is also increased because the expensive local solver is launched just once near each identified feasible region. The method is demonstrated on a variety of highly nonlinear models.
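The clustering step described above can be sketched in a few lines. This is a hedged illustration of generic single-linkage clustering with a supplied critical distance (the paper derives that distance from an inter-point distance frequency distribution, which is not reproduced here; the function name is hypothetical):

```python
# Hedged sketch: single-linkage clustering of Constraint Consensus end
# points. Any two points closer than the critical distance share a
# cluster; the cluster count estimates the number of disjoint feasible
# regions.

import math

def single_linkage_clusters(points, critical_dist):
    """Union-find single linkage; returns the number of clusters."""
    parent = list(range(len(points)))

    def find(i):
        while parent[i] != i:
            parent[i] = parent[parent[i]]  # path halving
            i = parent[i]
        return i

    for i in range(len(points)):
        for j in range(i + 1, len(points)):
            if math.dist(points[i], points[j]) <= critical_dist:
                parent[find(i)] = find(j)
    return len({find(i) for i in range(len(points))})

# Two tight groups far apart -> two estimated feasible regions.
pts = [(0.0, 0.0), (0.1, 0.0), (5.0, 5.0), (5.1, 5.0)]
assert single_linkage_clusters(pts, 0.5) == 2
```

Choosing the critical distance too large merges distinct regions and too small splits one region into several, which is why the paper's frequency-distribution technique for selecting it is the central contribution.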
Preprocessing for Determining the Difficulty of and Selecting a Solution Strategy for Nonconvex Optimization Problems
, 2003
Based originally on work of McCormick, a number of recent global optimization algorithms have relied on replacing an original nonconvex nonlinear program by convex or linear relaxations. Such linear relaxations can be generated automatically through an automatic differentiation process. This process decomposes the objective and constraints (if any) into convex and nonconvex unary and binary operations. The convex operations can be approximated arbitrarily well by appending additional constraints, while the domain must somehow be subdivided (in an overall branch and bound process or in some other local process) to handle nonconvex constraints. In general, a problem can be hard if even a single nonconvex term appears. However, certain nonconvex terms lead to easier-to-solve problems than others. In this paper, we present a symbolic preprocessing step that provides a measure of the intrinsic difficulty of a problem. Based on this step, one of two methods can be chosen to relax nonconvex terms.
Construction of Validated Uniqueness Regions for Nonlinear Programs in Which Convex Subspaces Have Been Identified
, 2005
In deterministic global optimization algorithms for constrained problems, it can be advantageous to identify and utilize coordinates in which the problem is convex, as Epperly and Pistikopoulos have done. In self-validating versions of these algorithms, a useful technique is to construct regions about approximate optima, within which unique local optima are known to exist; these regions are to be as large as possible, for exclusion from the continuing search process. In this paper, we clarify the theory and develop algorithms for constructing such large regions, when we know the problem is convex in some of the variables. In addition, this paper clarifies how one can validate existence and uniqueness of local minima when using the Fritz John equations in the general case. We present numerical results that provide evidence of the efficacy of our techniques.
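The kind of existence-and-uniqueness certification the abstract refers to is typically built on interval Newton operators. As a hedged, one-dimensional sketch (generic interval Newton on f(x) = 0, not the paper's Fritz John construction, and ignoring directed rounding): if the Newton image of an interval lands strictly inside that interval, the interval contains exactly one zero of f:

```python
# Hedged sketch: one-dimensional interval Newton uniqueness test.
# If N(X) = m - f(m)/f'(X) is strictly contained in X = [lo, hi],
# then X contains exactly one zero of f. Rounding is not controlled
# here, so this is illustrative rather than fully rigorous.

def interval_newton_unique(f, fprime_lo, fprime_hi, lo, hi):
    """Return True if the interval Newton step certifies a unique zero in [lo, hi]."""
    m = 0.5 * (lo + hi)
    fm = f(m)
    # f'(X) must exclude zero for the quotient to be a single interval.
    if fprime_lo <= 0.0 <= fprime_hi:
        return False
    q = sorted([fm / fprime_lo, fm / fprime_hi])
    n_lo, n_hi = m - q[1], m - q[0]
    return lo < n_lo and n_hi < hi   # strict containment => unique zero

# f(x) = x**2 - 2 on [1.3, 1.6]; f'(x) = 2x lies in [2.6, 3.2].
assert interval_newton_unique(lambda x: x * x - 2.0, 2.6, 3.2, 1.3, 1.6)
```

In the algorithms above, such certified regions around approximate optima are then excluded from the remaining branch and bound search, which is why constructing them as large as possible pays off.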
On Rigorous Upper Bounds to a Global Optimum
In branch and bound algorithms in constrained global optimization, a sharp upper bound on the global optimum is important for the overall efficiency of the branch and bound process. Software to find local optimizers, using floating point arithmetic, often computes an approximately feasible point close to an actual global optimizer. Algorithms that are not mathematically rigorous can simply evaluate the objective at such points to obtain approximate upper bounds. However, such points may actually be slightly infeasible, and the corresponding objective values may be slightly smaller than the global optimum. A consequence is that actual optimizers are occasionally missed, while the algorithm returns an approximate optimum and corresponding approximate optimizer that is occasionally far away from an actual global optimizer. In mathematically rigorous algorithms, objective values are accepted as upper bounds only if the point of evaluation is proven to be feasible. Such computational proofs of feasibility have been weak points in mathematically rigorous algorithms. This paper first reviews previously proposed automatic proofs of feasibility, then proposes an alternative technique. The alternative technique is tried on a test set that caused trouble for previous techniques, and is also employed in a mathematically rigorous branch and bound algorithm on that test set.
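A minimal sketch of why such feasibility proofs matter (an assumed example, not the paper's technique): evaluate an inequality constraint g(x) <= 0 with an outward-rounded enclosure over a small box around the approximate optimizer; if even the enclosure's upper bound is nonpositive, every point of the box is provably feasible for that constraint. Here g(x) = x**2 - 2 and the enclosure is hand-rolled for this one function:

```python
# Hedged sketch: rigorous feasibility check for g(x) = x**2 - 2 <= 0
# over a small box [lo, hi], padding the enclosure outward by one ulp
# to absorb rounding in the multiplications.

import math

def enclose_sq_minus_2(lo, hi):
    """Return an outward-rounded enclosure of x**2 - 2 over [lo, hi]."""
    candidates = [lo * lo, hi * hi]
    low = 0.0 if lo <= 0.0 <= hi else min(candidates)
    high = max(candidates)
    return (math.nextafter(low - 2.0, -math.inf),
            math.nextafter(high - 2.0, math.inf))

# Approximate optimizer x* ~ 1.414 with a tiny certification box around it:
g_lo, g_hi = enclose_sq_minus_2(1.41, 1.4141)
assert g_hi <= 0.0   # the whole box is provably feasible for x**2 - 2 <= 0
```

If the box instead straddles the constraint boundary, the enclosure's upper bound turns positive and no feasibility certificate is issued, which is exactly the situation where a merely approximate point could yield an invalid upper bound on the optimum.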