Results 1–10 of 49
On Lagrangian relaxation of quadratic matrix constraints
 SIAM J. Matrix Anal. Appl.
, 2000
Abstract

Cited by 53 (18 self)
Quadratically constrained quadratic programs (QQPs) play an important modeling role for many diverse problems. These problems are in general NP-hard and numerically intractable. Lagrangian relaxations often provide good approximate solutions to these hard problems. Such relaxations are equivalent to semidefinite programming relaxations. For several special cases of QQP, e.g., convex programs and trust region subproblems, the Lagrangian relaxation provides the exact optimal value, i.e., there is a zero duality gap. However, this is not true for the general QQP, or even for a QQP with two convex constraints but a nonconvex objective. In this paper we consider a certain QQP where the quadratic constraints correspond to the matrix orthogonality condition XX^T = I. For this problem we show that the Lagrangian dual based on relaxing the constraints XX^T = I and the seemingly redundant constraints X^T X = I has a zero duality gap. This result has natural applications to quadratic assignment and graph partitioning problems, as well as the problem of minimizing the weighted sum of the largest eigenvalues of a matrix. We also show that the technique of relaxing quadratic matrix constraints can be used to obtain a strengthened semidefinite relaxation for the max-cut problem.
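The equivalence between Lagrangian and semidefinite relaxations asserted in this abstract can be made concrete for a homogeneous QQP; the following is a standard textbook sketch, not taken from the paper itself:

```latex
% Homogeneous QQP:
\min_x \; x^T Q_0 x
\quad\text{s.t.}\quad x^T Q_i x = b_i,\quad i = 1,\dots,m.
% Lagrangian: L(x,\lambda) = x^T Q_0 x + \sum_i \lambda_i \,(b_i - x^T Q_i x).
% The inner minimization over x is bounded iff the quadratic form is PSD, so
d^{*} \;=\; \max_{\lambda} \;\Bigl\{\, \lambda^T b \;:\; Q_0 - \sum_{i=1}^{m} \lambda_i Q_i \succeq 0 \Bigr\},
% which is a semidefinite program; its conjugate dual is the SDP relaxation
% of the original QQP, hence "Lagrangian relaxation = SDP relaxation" here.
```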
Generalized Lagrangian Duals and Sums of Squares Relaxations of Sparse Polynomial Optimization Problems
, 2004
Abstract

Cited by 35 (20 self)
Sequences of generalized Lagrangian duals and their SOS (sums of squares of polynomials) relaxations for a POP (polynomial optimization problem) are introduced. Sparsity of the polynomials in the POP is used to reduce the sizes of the Lagrangian duals and their SOS relaxations. It is proved that the optimal values of the Lagrangian duals in the sequence converge to the optimal value of the POP, using a method from the penalty function approach. The sequence of SOS relaxations is transformed into a sequence of SDP (semidefinite program) relaxations of the POP, which correspond to duals of modifications and generalizations of the SDP relaxations given by Lasserre for the POP.
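The basic shape of a generalized Lagrangian dual with SOS multipliers can be sketched as follows (the standard dense form, not the paper's sparse variant):

```latex
% For the POP
\min_x\; f(x) \quad\text{s.t.}\quad g_j(x) \ge 0,\quad j = 1,\dots,m,
% attach SOS polynomial multipliers \varphi_j in place of scalar ones:
L(x,\varphi) \;=\; f(x) - \sum_{j=1}^{m} \varphi_j(x)\, g_j(x),
\qquad \varphi_j \in \Sigma \ \text{(the SOS cone)},
% and bound the optimal value from below by the SOS feasibility problem
\sup\;\Bigl\{\, \gamma \;:\; f - \sum_{j} \varphi_j\, g_j - \gamma \in \Sigma,\;
\deg \varphi_j \le 2r \Bigr\},
% which becomes a finite SDP once the relaxation order r is fixed.
```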
Discretization and Localization in Successive Convex Relaxation Methods for Nonconvex Quadratic Optimization
, 2000
Abstract

Cited by 26 (13 self)
Based on the authors' previous work, which established theoretical foundations of two conceptual successive convex relaxation methods, i.e., the SSDP (Successive Semidefinite Programming) Relaxation Method and the SSILP (Successive Semi-Infinite Linear Programming) Relaxation Method, this paper proposes their implementable variants for general quadratic optimization problems. These problems have a linear objective function c^T x to be maximized over a nonconvex compact feasible region F described by a finite number of quadratic inequalities. We introduce two new techniques, "discretization" and "localization," into the SSDP and SSILP Relaxation Methods. The discretization technique makes it possible to approximate the infinite number of semi-infinite SDPs (or semi-infinite LPs) which appear at each iteration of the original methods by a finite number of standard SDPs (or standard LPs) with a finite number of linear inequality constraints. We establish: given any open convex ...
Second Order Cone Programming Relaxation of Nonconvex Quadratic Optimization Problems
Conic mixed-integer rounding cuts
 University of California, Berkeley
, 2006
Abstract

Cited by 21 (4 self)
Abstract. A conic integer program is an integer programming problem with conic constraints. Many important problems in finance, engineering, statistical learning, and probabilistic optimization are modeled using conic constraints. Here we study mixed-integer sets defined by second-order conic constraints. We introduce general-purpose cuts for conic mixed-integer programming based on polyhedral conic substructures of second-order conic sets. These cuts can be readily incorporated in branch-and-bound algorithms that solve continuous conic programming or linear programming relaxations of conic integer programs at the nodes of the branch-and-bound tree. Central to our approach is a reformulation of the second-order conic constraints with polyhedral second-order conic constraints in a higher-dimensional space. In this representation the cuts we develop are linear, even though they are nonlinear in the original space of variables. This feature leads to a computationally efficient implementation of nonlinear cuts for conic mixed-integer programming. The reformulation also allows the use of polyhedral methods for conic integer programming. Our computational experiments show that conic mixed-integer rounding cuts are very effective in reducing the integrality gap of continuous relaxations of conic mixed-integer programs and, hence, improving their solvability.
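A minimal single-variable illustration of the mixed-integer rounding idea behind such cuts (an assumed toy set, not the paper's general construction): for S = {(x, t) : x integer, |x - b| <= t} with fractional b and f = b - floor(b), the linear inequality t >= f * (ceil(b) - x) is valid for every point of S, yet it cuts off the continuous-relaxation point (x, t) = (b, 0).

```python
import math

def conic_mir_cut(b):
    """For S = {(x, t) : x integer, |x - b| <= t} with fractional b,
    return (f, ceil(b)) defining the valid cut  t >= f * (ceil(b) - x),
    where f = b - floor(b).  A toy one-dimensional illustration of the
    conic MIR idea, not the paper's general cut."""
    f = b - math.floor(b)
    return f, math.ceil(b)

b = 2.3
f, cb = conic_mir_cut(b)

# Validity: every integer x with the tightest feasible t = |x - b|
# must satisfy the cut.
valid = all(abs(x - b) >= f * (cb - x) - 1e-12 for x in range(-50, 50))

# The cut removes the fractional continuous-relaxation point (x, t) = (b, 0):
cuts_off = 0 < f * (cb - b)
print(valid, cuts_off)  # True True
```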
A General Framework for Convex Relaxation of Polynomial Optimization Problems over Cones
 Journal of the Operations Research Society of Japan
, 2002
Abstract

Cited by 20 (10 self)
The class of POPs (polynomial optimization problems) over cones covers a wide range of optimization problems such as 0-1 integer linear and quadratic programs, nonconvex quadratic programs, and bilinear matrix inequalities. This paper presents a new framework for convex relaxation of POPs over cones in terms of linear optimization problems over cones. It provides a unified treatment of many existing convex relaxation methods based on the lift-and-project linear programming procedure, the reformulation-linearization technique, and the semidefinite programming relaxation for a variety of problems. It also extends the theory of convex relaxation methods, and thereby brings flexibility and richness to practical use of the theory.
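The reformulation-linearization technique mentioned here can be seen in its simplest generic form (a standard illustration, not the paper's cone framework):

```latex
% From variable bounds 0 \le x_i \le 1 and 0 \le x_j \le 1, multiply pairs
% of valid inequalities and linearize the product x_i x_j \to X_{ij}:
(1 - x_i)(1 - x_j) \ge 0 \;\Rightarrow\; X_{ij} \ge x_i + x_j - 1,
\qquad
x_i (1 - x_j) \ge 0 \;\Rightarrow\; X_{ij} \le x_i,
% together with X_{ij} \le x_j and X_{ij} \ge 0.  The resulting linear
% system in (x, X) is a polyhedral relaxation of the nonconvex quadratic set.
```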
Branch-and-Cut Algorithms for the Bilinear Matrix Inequality Eigenvalue Problem
 Comput. Optim. Appl
, 1999
Abstract

Cited by 20 (1 self)
The optimization problem with a Bilinear Matrix Inequality (BMI) constraint is one of the problems that have greatly interested researchers in control and system theory in the last few years. This inequality allows various problems of robust control to be reduced elegantly to its form. However, unlike the Linear Matrix Inequality (LMI), which can be solved by interior-point methods, the BMI is a computationally difficult object in theory and in practice. This article improves the branch-and-bound algorithm of Goh, Safonov and Papavassilopoulos (1995) by applying a better convex relaxation of the BMI Eigenvalue Problem (BMIEP), and proposes new Branch-and-Bound and Branch-and-Cut Algorithms. Numerical experiments were conducted in a systematic way over randomly generated problems, and they show the robustness and the efficiency of the proposed algorithms. Keywords: Bilinear Matrix Inequality, Branch-and-Cut Algorithm, Convex Relaxation, Cut Polytope.
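For reference, the BMI eigenvalue problem is usually stated in the following standard form (included as a sketch of the problem class, not quoted from the article):

```latex
\min_{x \in \mathbb{R}^p,\; y \in \mathbb{R}^q} \; \lambda_{\max}\bigl( F(x, y) \bigr),
\quad
F(x,y) = F_{00} + \sum_{i=1}^{p} x_i F_{i0} + \sum_{j=1}^{q} y_j F_{0j}
       + \sum_{i=1}^{p} \sum_{j=1}^{q} x_i y_j F_{ij},
% with symmetric data matrices F_{ij}.  The bilinear terms x_i y_j make the
% feasible set nonconvex, unlike an LMI, which is affine in its variables.
```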
Strong Duality for a Trust-Region Type Relaxation of the Quadratic Assignment Problem
, 1998
Abstract

Cited by 15 (9 self)
Lagrangian duality underlies many efficient algorithms for convex minimization problems. A key ingredient is strong duality. Lagrangian relaxation also provides lower bounds for nonconvex problems, where the quality of the lower bound depends on the duality gap. Quadratically constrained quadratic programs (QQPs) provide important examples of nonconvex programs. For the simple case of one quadratic constraint (the trust region subproblem), strong duality holds. In addition, necessary and sufficient (strengthened) second-order optimality conditions exist. However, these duality results already fail for the two trust region subproblem. Surprisingly, there are classes of more complex, nonconvex QQPs where strong duality holds. One example is the special case of orthogonality constraints, which arise naturally in relaxations for the quadratic assignment problem (QAP). In this paper we show that strong duality also holds for a relaxation of QAP where the orthogonality constraint is replaced ...
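The zero duality gap for the trust region subproblem can be checked numerically. A minimal sketch with assumed illustrative data (not from the paper): bisect for the multiplier lam with ||x(lam)|| = Delta, then verify that the Lagrangian value matches the primal value and that no sampled feasible point does better.

```python
import numpy as np

# Trust region subproblem:  min q(x) = x^T A x + 2 b^T x  s.t. ||x|| <= Delta.
# A is indefinite, so the problem is nonconvex, yet strong duality holds:
# lam >= 0 with (A + lam*I) PSD, (A + lam*I) x = -b, lam*(||x||^2 - Delta^2) = 0
# certifies a global optimum.
A = np.diag([-1.0, 2.0])          # indefinite Hessian (illustrative choice)
b = np.array([1.0, 1.0])
Delta = 1.0

def x_of(lam):
    return -np.linalg.solve(A + lam * np.eye(2), b)

# ||x(lam)|| decreases for lam > -lambda_min(A) = 1; bisect for ||x(lam)|| = Delta.
lo, hi = 1.0 + 1e-9, 1e6
for _ in range(200):
    mid = 0.5 * (lo + hi)
    lo, hi = (mid, hi) if np.linalg.norm(x_of(mid)) > Delta else (lo, mid)
lam = 0.5 * (lo + hi)
xs = x_of(lam)

def q(x):
    return x @ A @ x + 2 * b @ x

# Lagrangian value at (xs, lam); equals q(xs) by complementarity.
dual_val = q(xs) + lam * (xs @ xs - Delta ** 2)

# No randomly sampled feasible point beats the certified optimum:
rng = np.random.default_rng(0)
pts = rng.uniform(-Delta, Delta, size=(20000, 2))
pts = pts[np.linalg.norm(pts, axis=1) <= Delta]
best_sampled = min(q(p) for p in pts)
print(abs(dual_val - q(xs)) < 1e-8, best_sampled >= q(xs) - 1e-9)  # True True
```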
Semidefinite programming for discrete optimization and matrix completion problems
 Discrete Appl. Math
, 2002
Abstract

Cited by 14 (5 self)
Survey article for the proceedings of Discrete Optimization '99, where some of these results were presented as a plenary address.
New Convex Relaxations for the Maximum Cut and VLSI Layout Problems
, 2001
Abstract

Cited by 13 (5 self)
It is well known that many of the optimization problems which arise in applications are “hard”, which usually means that they are NP-hard. Hence much research has been devoted to finding “good” relaxations for these hard problems. Usually a “good” relaxation is one which can be solved (either exactly or within a prescribed numerical tolerance) in polynomial time. Nesterov and Nemirovskii showed that by this criterion, many convex optimization problems are good relaxations. This thesis presents new convex relaxations for two such hard problems, namely the Maximum-Cut (Max-Cut) problem and the VLSI (Very Large Scale Integration of electronic circuits) layout problem. We derive and study the properties of two new strengthened semidefinite pro ...
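One of the simplest convex bounds in this family is the classical eigenvalue bound maxcut(G) <= (n/4) * lambda_max(L), where L is the graph Laplacian; strengthened SDP relaxations tighten bounds of this type. A small numerical check on an assumed example graph, the 5-cycle C5:

```python
import numpy as np

# Eigenvalue bound on the maximum cut: maxcut(G) <= (n/4) * lambda_max(L),
# checked on the unweighted 5-cycle C5 (illustrative example).
n = 5
edges = [(i, (i + 1) % n) for i in range(n)]
L = np.zeros((n, n))
for i, j in edges:
    L[i, i] += 1; L[j, j] += 1
    L[i, j] -= 1; L[j, i] -= 1

# eigvalsh returns eigenvalues in ascending order; take the largest.
bound = n / 4 * np.linalg.eigvalsh(L)[-1]

# Brute-force maximum cut over all 2^n vertex bipartitions.
best = max(
    sum(1 for i, j in edges if (mask >> i & 1) != (mask >> j & 1))
    for mask in range(2 ** n)
)
print(best, round(bound, 3))  # 4 4.523
```

For C5 the true maximum cut is 4 while the bound gives about 4.523, showing the gap that the stronger relaxations aim to shrink.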