CiteSeerX
On proving existence of feasible points in equality constrained optimization problems. (1995)

by R. B. Kearfott
Venue: Mathematical Programming

Results 1 - 10 of 12

Global Optimization For Constrained Nonlinear Programming

by Tao Wang, 2001
Abstract - Cited by 14 (2 self)
In this thesis, we develop constrained simulated annealing (CSA), a global optimization algorithm that asymptotically converges to constrained global minima (CGM dn ) with probability one, for solving discrete constrained nonlinear programming problems (NLPs). The algorithm is based on the necessary and sufficient condition for constrained local minima (CLM dn ) in the theory of discrete constrained optimization using Lagrange multipliers developed in our group. The theory proves the equivalence between the set of discrete saddle points and the set of CLM dn, leading to the first-order necessary and sufficient condition for CLM dn. To find
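The descent-in-variables, ascent-in-multipliers mechanism behind CSA can be illustrated on a toy problem. The neighborhood, cooling schedule, step sizes, and the problem itself below are illustrative assumptions, not the thesis's algorithm: the sketch anneals a penalty Lagrangian L(x, lam) = f(x) + lam·|h(x)|, accepting uphill moves in x with Metropolis probability while growing the multiplier whenever the current point is infeasible.

```python
# Hypothetical minimal sketch of the CSA idea; all parameters are
# illustrative choices, not the algorithm from the cited thesis.
import math
import random

def csa(f, h, xs, steps=20000, seed=0):
    rng = random.Random(seed)
    x, lam = xs[0], 0.0
    L = lambda x, lam: f(x) + lam * abs(h(x))  # penalty Lagrangian
    for k in range(steps):
        T = 1.0 / (1 + k // 100)               # simple cooling schedule
        if rng.random() < 0.5:                 # probabilistic descent in x
            cand = rng.choice(xs)
            d = L(cand, lam) - L(x, lam)
            if d <= 0 or rng.random() < math.exp(-d / T):
                x = cand
        elif h(x) != 0:                        # ascent in lam while infeasible
            lam += 0.01 * abs(h(x))
    # report the minimizer of the final Lagrangian
    return min(xs, key=lambda c: L(c, lam))

# toy discrete problem: min f(x) = x^2  s.t.  h(x) = x - 3 = 0, x in {0..5}
x_star = csa(lambda x: x * x, lambda x: x - 3, list(range(6)))
print(x_star)
```

Once the multiplier has grown enough that violating the constraint is never worthwhile, the feasible point x = 3 minimizes the Lagrangian even though the unconstrained minimizer of f is x = 0.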

Citation Context

... subregions that may have good solutions while dropping unpromising subregions. In order to verify the existence of feasible points for a given subregion, they usually utilize interval Newton methods [116, 115]. GlobSol [80] and ILOG solver [108] are popular software packages that implement these methods. Interval methods, however, are unable to cope with general nonlinear constraints and may be misled by i...

Rigorous error bounds for the optimal value in semidefinite programming

by Christian Jansson, Denis Chaykin, Christian Keil - SIAM J. Numer. Anal.
Abstract - Cited by 13 (4 self)
Abstract. A wide variety of problems in global optimization, combinatorial optimization as well as systems and control theory can be solved by using linear and semidefinite programming. Sometimes, due to the use of floating point arithmetic in combination with ill-conditioning and degeneracy, erroneous results may be produced. The purpose of this article is to show how rigorous error bounds for the optimal value can be computed by carefully postprocessing the output of a linear or semidefinite programming solver. It turns out that in many cases the computational costs for postprocessing are small compared to the effort required by the solver. Numerical results are presented including problems from the SDPLIB and the NETLIB LP library; these libraries contain many ill-conditioned and real life problems.
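The core of such postprocessing is weak duality: any dual point verified feasible in exact (or outward-rounded) arithmetic certifies a bound on the optimal value, no matter how the solver obtained it. A minimal sketch for linear programming, using exact rationals in place of the directed rounding a real implementation would use (the toy data and function name are illustrative assumptions, not the paper's method for SDP):

```python
# Sketch of the weak-duality postprocessing idea with exact rationals.
from fractions import Fraction as F

def rigorous_lower_bound(A, b, c, y):
    """For min c^T x s.t. Ax = b, x >= 0: verify A^T y <= c exactly,
    then return the certified lower bound b^T y (None if y infeasible)."""
    m, n = len(A), len(A[0])
    for j in range(n):                       # check each dual constraint
        if sum(A[i][j] * y[i] for i in range(m)) > c[j]:
            return None                      # y is not dual feasible
    return sum(b[i] * y[i] for i in range(m))

# min x1 + 2*x2  s.t.  x1 + x2 = 1, x >= 0   (true optimum: 1)
A = [[F(1), F(1)]]; b = [F(1)]; c = [F(1), F(2)]
y_approx = [F(999, 1000)]                    # e.g. rounded solver output
lb = rigorous_lower_bound(A, b, c, y_approx)
print(lb)  # -> 999/1000
```

The approximate dual solution need not be optimal; a slightly suboptimal y still yields a valid, slightly weaker bound, which is why the postprocessing cost stays small.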

Citation Context

...degeneration. Hansen used this technique in order to prove existence of a feasible point for nonlinear equations within a bounded box. It was further modified and investigated numerically by Kearfott [11], [12], and is also described in his book [13]. Corresponding algorithms are implemented in his software package GlobSol. 5. Certificate of Infeasibility. In branch and bound algorithms a subproblem i...

GlobSol: History, composition, and advice on use

by R. Baker Kearfott - In Global Optimization and Constraint Satisfaction, Lecture Notes in Computer Science, 2003
Abstract - Cited by 10 (7 self)
Abstract. The GlobSol software package combines various ideas from interval analysis, automatic differentiation, and constraint propagation to provide verified solutions to unconstrained and constrained global optimization problems. After briefly reviewing some of these techniques and GlobSol’s development history, we provide the first overall description of GlobSol’s algorithm. Giving advice on use, we point out strengths and weaknesses in GlobSol’s approaches. Through examples, we show how to configure and use GlobSol.

Citation Context

...more complicated than evaluating the objective function at a point. In particular, the point of evaluation must be known to be feasible. We explain the process in GlobSol for verifying feasibility in [18] and [16, §5.2.4]. This process is presently a weakness that prevents some equality-constrained problems from being handled as efficiently as unconstrained or certain inequality-constrained problems. G...

Discussion and Empirical Comparisons of Linear Relaxations and Alternate Techniques in Validated Deterministic Global Optimization

by R. Baker Kearfott , 2004
Abstract - Cited by 8 (0 self)
1 Introduction. 1.1 The General Global Optimization Problem. Our general global optimization problem can be stated as

Optimal Anytime Search For Constrained Nonlinear Programming

by Yixin Chen, 2001
Abstract - Cited by 6 (2 self)
In this thesis, we study optimal anytime stochastic search algorithms (SSAs) for solving general constrained nonlinear programming problems (NLPs) in discrete, continuous and mixed-integer space. The algorithms are general in the sense that they do not assume differentiability or convexity of functions. Based on the search algorithms, we develop the theory of SSAs and propose optimal SSAs with iterative deepening in order to minimize their expected search time. Based on the optimal SSAs, we then develop optimal anytime SSAs that generate improved solutions as more search time is allowed. Our SSAs

Citation Context

...nd whose functions are not differentiable. Hence, methods that require differentiability cannot be applied. Such methods include interval methods requiring derivatives (such as interval-Newton methods) [101, 100, 87], gradient-descent methods, Newton's method [109, 121, 129], trajectory methods [158, 155, 58, 135, 30, 143, 149, 28, 152], covering methods requiring derivatives [38, 39], penalty methods requiring d...

Improved and simplified validation of feasible points: Inequality and equality constrained problems

by R. Baker Kearfott - Mathematical Programming, submitted, 2005
Abstract - Cited by 3 (1 self)
Abstract. In validated branch and bound algorithms for global optimization, upper bounds on the global optimum are obtained by evaluating the objective at an approximate optimizer; the upper bounds are then used to eliminate subregions of the search space. For constrained optimization, in general, a small region must be constructed within which existence of a feasible point can be proven, and an upper bound on the objective over that region is obtained. We had previously proposed a perturbation technique for constructing such a region. In this work, we propose a much simplified and improved technique, based on an orthogonal decomposition of the normal space to the constraints. In purely inequality constrained problems, a point, rather than a region, can be used, and, for equality and inequality constrained problems, the region lies in a smaller-dimensional subspace, giving rise to sharper upper bounds. Numerical experiments on published test sets for global optimization provide evidence of the superiority of the new approach within our GlobSol environment. 1.
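The orthogonal-decomposition step can be sketched concretely: at an approximate optimizer, orthonormalize the gradients of the active constraints (here by Gram-Schmidt) to obtain a basis of the normal space to the constraint manifold; feasibility then only needs to be verified along these few directions rather than over a full-dimensional box. The constraint, point, and function name below are illustrative assumptions, not the paper's implementation:

```python
# Illustrative sketch of extracting a normal-space basis from
# constraint gradients via Gram-Schmidt (pure stdlib).
import math

def gram_schmidt(vectors, tol=1e-12):
    """Orthonormalize the given constraint gradients; the result spans
    the normal space to the constraint manifold at the current point."""
    basis = []
    for v in vectors:
        w = list(v)
        for b in basis:
            dot = sum(wi * bi for wi, bi in zip(w, b))
            w = [wi - dot * bi for wi, bi in zip(w, b)]
        nrm = math.sqrt(sum(wi * wi for wi in w))
        if nrm > tol:                    # skip linearly dependent gradients
            basis.append([wi / nrm for wi in w])
    return basis

# one equality constraint g(x) = x1^2 + x2^2 - 1 = 0 near (0.7, 0.7)
x = (0.7, 0.7)
grad_g = [2 * x[0], 2 * x[1]]            # gradient spans the normal space
N = gram_schmidt([grad_g])
print(N)  # one unit vector; the verification region is x + t*N[0]
          # for t in a small interval, instead of a full 2-D box
```

With m equality constraints in R^n, the same construction yields an m-dimensional region, which is the source of the sharper upper bounds the abstract describes.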

Citation Context

...mate optimizer ˇx. However, because of likely singularity of this system, this technique is likely to fail; see the discussion in [5, §2] to gain insight into the case where the NLP (1) is linear. In [3], we proposed and provided test results for a technique for perturbing the approximate optimizing point ˇx and constructing a box about the perturbed feasible point within which an interval Newton ...

An iterative method for finding approximate feasible points

by R. B. Kearfott, J. Dian, 1998
Abstract - Cited by 3 (2 self)
Abstract not found

Citation Context

...ntaining an approximate minimizer. However, in constrained problems such as Problem 1, a rigorous upper bound is obtained with this process only if it is certain that X contains a feasible point (see [5]). Generally, if a good approximation to a feasible point (or solution of the constraints) is known, interval Newton methods can prove that an actual feasible point exists within specified bounds, at ...
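The existence proof rests on the interval Newton operator: if N(X) = m − f(m)/f′(X) lies strictly inside the box X, then X contains a (unique) zero of f. A minimal one-dimensional sketch, using exact rationals in place of the outward rounding a real interval library would use (the helper names and toy equation are illustrative assumptions):

```python
# Minimal interval Newton existence test (illustrative, not a library API).
from fractions import Fraction as F

def div_scalar(s, lo, hi):
    """Divide scalar s by an interval [lo, hi] not containing 0."""
    q = [s / lo, s / hi]
    return min(q), max(q)

def interval_newton_step(f, df_interval, lo, hi):
    m = (lo + hi) / 2
    dlo, dhi = df_interval(lo, hi)    # enclosure of f' over [lo, hi]
    qlo, qhi = div_scalar(f(m), dlo, dhi)
    return m - qhi, m - qlo           # N(X) = m - f(m)/f'(X)

# f(x) = x^2 - 2 on X = [1, 2]; f'(x) = 2x, so f'(X) = [2, 4]
f = lambda x: x * x - 2
dfX = lambda lo, hi: (2 * lo, 2 * hi)
nlo, nhi = interval_newton_step(f, dfX, F(1), F(2))
assert F(1) < nlo and nhi < F(2)      # N(X) strictly inside X: a zero exists
print(nlo, nhi)  # -> 11/8 23/16
```

The strict containment N(X) ⊂ X is the computational certificate that a true solution of f(x) = 0, i.e. a feasible point of the constraint, lies in X.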

Rigorous verification of feasibility

by Ferenc Domes, Arnold Neumaier
Abstract - Cited by 2 (1 self)
Abstract not found

Constraint aggregation for rigorous global optimization

by Ferenc Domes, Arnold Neumaier
Abstract
Abstract In rigorous constrained global optimization, upper bounds on the objective function help to reduce the search space. Their construction requires finding a narrow box around an approximately feasible solution, verified to contain a feasible point. Approximations are easily found by local optimization, but the verification often fails. In this paper we show that even if the verification of an approximate feasible point fails, the information extracted from the local optimization can still be used in many cases to reduce the search space. This is done by a rigorous filtering technique called constraint aggregation. It forms an aggregated redundant constraint, based on approximate Lagrange multipliers or on a vector valued measure of constraint violation. Using the optimality conditions, two sided linear relaxations, the GaussJordan algorithm and a directed modified Cholesky factorization, the information in the redundant constraint is turned into powerful bounds on the feasible set. Constraint aggregation is especially useful since it also works in a tiny neighborhood of the global optimizer, thereby reducing the cluster effect. A simple introductory example demonstrates how our new method works. Extensive tests show the performance on a large benchmark.
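The pruning use of an aggregated constraint can be sketched in a few lines. The toy constraints, multipliers, and tiny interval helpers below are illustrative assumptions, not the paper's implementation: equality constraints h_i(x) = 0 are combined with approximate multipliers lam into one redundant constraint Σ_i lam_i·h_i(x) = 0, which is then evaluated over a box in interval arithmetic; a box whose enclosure excludes 0 cannot contain a feasible point and can be discarded.

```python
# Hypothetical sketch of constraint aggregation as a pruning test.

def iadd(a, b):                # interval addition
    return (a[0] + b[0], a[1] + b[1])

def iscale(s, a):              # scalar times interval
    lo, hi = s * a[0], s * a[1]
    return (min(lo, hi), max(lo, hi))

def aggregate_excludes_zero(box, lam):
    """Toy constraints h1 = x1 + x2 - 2, h2 = x1 - x2; enclose the
    aggregated constraint lam . h over the box."""
    x1, x2 = box
    h1 = iadd(iadd(x1, x2), (-2, -2))
    h2 = iadd(x1, iscale(-1, x2))
    agg = iadd(iscale(lam[0], h1), iscale(lam[1], h2))
    return agg[0] > 0 or agg[1] < 0     # 0 not enclosed -> box pruned

lam = (1.0, 1.0)                         # approximate multipliers
print(aggregate_excludes_zero(((2, 3), (0, 1)), lam))  # -> True (pruned)
print(aggregate_excludes_zero(((0, 2), (0, 2)), lam))  # -> False (kept)
```

The only feasible point of the toy system is (1, 1); the first box excludes it and is correctly discarded by the single aggregated test, while the second box, which contains it, survives.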

Citation Context

... that Z(ν, x, y) := F′(x)^T (νa − B^T y) ∈ z, (14) and the complementarity conditions max((BF_j(x) − b_j)y_j, (BF_j(x) − b_j)y_j) = 0 (15) hold for j = 1, …, m. These conditions comprise the Karush-John optimality conditions for the problem (8); cf. the derivation and discussion of the history in Schichl and Neumaier [31]. If ν ≠ 0 we may rescale the multipliers to have ν = 1, leading to the Kuhn-Tucker optimality conditions. We normalize instead by rescaling so that max(ν, ‖y‖∞) = 1, which is possible even when ν = 0 and leads to bounded multipliers. This is achieved by ν ← 1/max(1, ‖y‖∞), y ← νy. (16) The following result suggests a way to define Lagrange multipliers y ∈ R^m (for the constraints) and ν ∈ R (for the objective) at an arbitrary point x (intended to be an approximate local minimizer). Theorem 1. If, for some x ∈ R^n, the constrained optimization problem min g(y) := ‖µ(Z(1, x, y) − z)‖₂² s.t. y ∈ y (17) (with Z from (14) and µ from (6)) has a solution y with g(y) = 0, then (x, y) satisfies the Kuhn-Tucker conditions for (8), and (16) defines associated normalized multipliers satisfying the Karush-John conditions. Proof. From g(y) = 0 it follows that µ(Z(1, x, y) − z) = 0, implyin...

An Example of Singularity in Global Optimization

by R. Baker Kearfott
Abstract
Abstract. Certain practical constrained global optimization problems have to date defied practical solution with interval branch and bound methods. The exact mechanism causing the difficulty has been difficult to pinpoint. Here, an example is given where the equality constraint set has higher-order singularities and degenerate manifolds of singularities on the feasible set. The reason that this causes problems is discussed, and ways of fixing it are suggested.

Citation Context

... found, then epsilon-inflation is used to construct a small box x, ˜x ∈ x, in a subspace of R^n within which an interval Newton method can prove there exists a true feasible point; see [5, §5.2.4] and [6]. Using the default configuration and initial box x = (a1, a2, a3, x1, x2, x3) = ([−2, 2], [−3, 3], [−3, 3], [−2, 2], [−2, 2], [−2, 2]), the May 2, 2000 version of GlobSol did not complete after processin...


Developed at and hosted by The College of Information Sciences and Technology

© 2007-2019 The Pennsylvania State University