Results 1-10 of 26
Using SeDuMi 1.02, a MATLAB toolbox for optimization over symmetric cones
, 1998
Abstract

Cited by 1368 (5 self)
SeDuMi is an add-on for MATLAB that lets you solve optimization problems with linear, quadratic and semidefiniteness constraints. It is possible to have complex-valued data and variables in SeDuMi. Moreover, large-scale optimization problems are solved efficiently by exploiting sparsity. This paper describes how to work with this toolbox.
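Solvers of this kind accept problems in the standard conic form min c'x subject to Ax = b, x in K. A minimal pure-Python sketch of that form, with K the nonnegative orthant (so the problem is a plain LP) and hand-picked primal/dual candidates; this is an illustration of the problem format and of weak duality, not SeDuMi's MATLAB interface:

```python
# Standard conic form: min c'x  s.t.  Ax = b, x in K.
# Here K = nonnegative orthant; the instance and candidates are hand-built.
A = [[1.0, 1.0]]          # one equality constraint: x1 + x2 = 1
b = [1.0]
c = [1.0, 2.0]

x = [1.0, 0.0]            # candidate primal solution (x in K: x >= 0)
y = [1.0]                 # candidate dual solution for max b'y s.t. A'y <= c

# Primal feasibility: Ax = b and x in K.
assert all(abs(sum(A[i][j] * x[j] for j in range(2)) - b[i]) < 1e-12
           for i in range(len(b)))
assert all(xi >= 0 for xi in x)

# Dual feasibility: slack s = c - A'y must lie in the dual cone (>= 0).
s = [c[j] - sum(A[i][j] * y[i] for i in range(len(y))) for j in range(2)]
assert all(sj >= -1e-12 for sj in s)

primal_obj = sum(ci * xi for ci, xi in zip(c, x))
dual_obj = sum(bi * yi for bi, yi in zip(b, y))

# Weak duality gives primal_obj >= dual_obj for any feasible pair;
# equality certifies that both candidates are optimal.
assert primal_obj >= dual_obj - 1e-12
print(primal_obj, dual_obj)
```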
Using SeDuMi 1.0x, a MATLAB toolbox for optimization over symmetric cones
, 1999
Abstract

Cited by 46 (0 self)
SeDuMi is an add-on for MATLAB which lets you solve optimization problems with linear, quadratic and semidefiniteness constraints. It is possible to have complex-valued data and variables in SeDuMi. Moreover, large-scale optimization problems are solved efficiently by exploiting sparsity. This paper describes how to work with this toolbox.
Convex optimization problems involving finite autocorrelation sequences
, 2001
Abstract

Cited by 40 (2 self)
We discuss convex optimization problems where some of the variables are constrained to be finite autocorrelation sequences. Problems of this form arise in signal processing and communications, and we describe applications in filter design and system identification. Autocorrelation constraints in optimization problems are often approximated by sampling the corresponding power spectral density, which results in a set of linear inequalities. They can also be cast as linear matrix inequalities via the Kalman-Yakubovich-Popov lemma. The linear matrix inequality formulation is exact, and results in convex optimization problems that can be solved using interior-point methods for semidefinite programming. However, it has an important drawback: to represent an autocorrelation sequence of length n, it requires the introduction of a large number (n(n + 1)/2) of auxiliary variables. This results in a high computational cost when general-purpose semidefinite programming solvers are used. We present a more efficient implementation based on duality and on interior-point methods for convex problems with generalized linear inequalities.
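The sampling approximation mentioned in the abstract can be sketched in a few lines: build a genuine autocorrelation sequence r from a filter h, then check its power spectral density R(w) = r_0 + 2*sum r_k cos(kw) on a grid. Each sampled constraint R(w_i) >= 0 is linear in the variables r_k, which is exactly why sampling yields linear inequalities. The filter below is an arbitrary example, not from the paper:

```python
import math

# A finite autocorrelation sequence comes from some filter h via
# r_k = sum_t h_t * h_{t+k}; its power spectral density is then
# R(w) = r_0 + 2 * sum_{k>=1} r_k cos(k w) >= 0 for all w.
h = [1.0, -0.5, 0.25]
n = len(h)
r = [sum(h[t] * h[t + k] for t in range(n - k)) for k in range(n)]

# Sampling R(w) on a grid replaces the infinite constraint R(w) >= 0
# by finitely many inequalities, each LINEAR in r_0, ..., r_{n-1}.
N = 512
grid = [math.pi * i / N for i in range(N + 1)]
psd = [r[0] + 2 * sum(r[k] * math.cos(k * w) for k in range(1, n))
       for w in grid]

min_psd = min(psd)
# A true autocorrelation sequence satisfies every sampled inequality.
assert min_psd >= -1e-9
print(r, min_psd)
```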
On the closedness of the linear image of a closed convex cone
, 1992
INFORMS. DOI: 10.1287/moor.1060.0242
Conic convex programming and self-dual embedding.
 Optimization Methods and Software,
, 2000
On sensitivity of central solutions in semidefinite programming
 MATH. PROGRAM
, 1998
Abstract

Cited by 12 (2 self)
In this paper we study the properties of the analytic central path of a semidefinite programming problem under perturbation of a set of input parameters. Specifically, we analyze the behavior of solutions on the central path with respect to changes in the right-hand side of the constraints, including the limiting behavior as the central optimal solution is approached. Our results are of interest for numerical analysis, sensitivity analysis and parametric programming. Under the primal-dual Slater condition and the strict complementarity condition we show that the derivatives of central solutions with respect to the right-hand side parameters converge as the path tends to the central optimal solution. Moreover, the derivatives are bounded, i.e., a Lipschitz constant exists. This Lipschitz constant can be thought of as a condition number for the semidefinite programming problem. It is a generalization of the familiar condition number for linear equation systems and linear programming problems. However, the generalized condition number depends on the right-hand side parameters as well, whereas it is well known that in the linear programming case the condition number depends only on the constraint matrix. We demonstrate that the existence of strictly complementary solutions is important for the Lipschitz constant to exist. Moreover, we give an example in which the set of right-hand side parameters for which the strict complementarity condition holds is neither open nor closed. This is remarkable since the analogous set for which the primal-dual Slater condition holds is always open.
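The boundedness of the right-hand-side derivatives can be seen already in a tiny LP (a scalar stand-in for the paper's SDP setting, not its actual construction). For min c'x subject to x1 + x2 = b, x >= 0, the barrier problem min c'x - mu*(ln x1 + ln x2) has KKT conditions c_i - mu/x_i = y, so the central solution is x_i = mu/(c_i - y) with the multiplier y < min(c) fixed by the equality constraint:

```python
# Central path of  min c'x  s.t.  x1 + x2 = b, x >= 0  (toy LP example).
# KKT: c_i - mu/x_i = y  =>  x_i = mu/(c_i - y),  sum_i x_i = b;
# the multiplier y is found by bisection on the increasing map
# g(y) = sum_i mu/(c_i - y) - b over y < min(c).
c = [1.0, 2.0]

def central_x(mu, b):
    lo, hi = min(c) - 1e6, min(c) - 1e-12
    for _ in range(200):
        y = 0.5 * (lo + hi)
        g = sum(mu / (ci - y) for ci in c) - b
        if g > 0:
            hi = y
        else:
            lo = y
    y = 0.5 * (lo + hi)
    return [mu / (ci - y) for ci in c]

# Finite-difference derivative of the central solution w.r.t. the
# right-hand side b: as mu -> 0 it stays bounded and converges
# (here to (1, 0)), the Lipschitz behavior the paper establishes
# under strict complementarity.
b, step = 1.0, 1e-4
for mu in (1e-2, 1e-4, 1e-6):
    xp = central_x(mu, b + step)
    xm = central_x(mu, b - step)
    dxdb = [(p - m) / (2 * step) for p, m in zip(xp, xm)]
    print(mu, dxdb)
```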
A New Self-Dual Embedding Method for Convex Programming
 Journal of Global Optimization
, 2001
Abstract

Cited by 11 (4 self)
In this paper we introduce a conic optimization formulation for inequality-constrained convex programming, and propose a self-dual embedding model for solving the resulting conic optimization problem. The primal and dual cones in this formulation are characterized by the original constraint functions and their corresponding conjugate functions respectively, and hence are completely symmetric. This allows for a standard primal-dual path-following approach for solving the embedded problem. Moreover, there are two immediate logarithmic barrier functions for the primal and dual cones. We show that these two logarithmic barrier functions are conjugate to each other. The explicit form of the conjugate functions is in fact not required to be known in the algorithm. An advantage of the new approach is that there is no need to assume an initial feasible solution to start with. To guarantee the polynomiality of the path-following procedure, we may apply the self-concordant barrier theory of Nesterov and Nemirovski. For this purpose, as one application, we prove that the barrier functions constructed this way are indeed self-concordant when the original constraint functions are convex and quadratic. Keywords: Convex Programming, Convex Cones, Self-Dual Embedding, Self-Concordant Barrier Functions.
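The conjugacy of barrier functions has a one-dimensional textbook prototype (not the paper's construction): the conjugate f*(y) = sup_x {xy - f(x)} of the logarithmic barrier f(x) = -ln(x) of the half-line is again a logarithmic barrier, f*(y) = -1 - ln(-y) for y < 0. A brute-force numerical check:

```python
import math

# Conjugate of the log barrier f(x) = -ln(x), x > 0:
# f*(y) = sup_{x>0} { x*y + ln(x) } = -1 - ln(-y)  for y < 0,
# i.e. the conjugate is again a log barrier (on the dual half-line).
def conjugate_of_neg_log(y, xmax=50.0, step=1e-3):
    # brute-force sup over a grid of x > 0
    best = -math.inf
    x = step
    while x <= xmax:
        best = max(best, x * y + math.log(x))
        x += step
    return best

y = -2.0
numeric = conjugate_of_neg_log(y)      # sup attained near x = -1/y = 0.5
closed_form = -1.0 - math.log(-y)
assert abs(numeric - closed_form) < 1e-4
print(numeric, closed_form)
```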
A Simple Derivation of a Facial Reduction Algorithm and Extended Dual Systems
Abstract

Cited by 10 (2 self)
The Facial Reduction Algorithm (FRA) of Borwein and Wolkowicz and the Extended Dual System (EDS) of Ramana aim to better understand duality when a conic linear system Ax ⪯_K b (P) has no strictly feasible solution. We
- provide a simple proof of the correctness of a variant of FRA;
- show how it naturally leads to the validity of a family of extended dual systems;
- summarize which subsets of K related to the system (P) (such as the minimal cone and its dual) have an extended representation.
Duality results for the conic linear system Ax ⪯_K b (P) are usually derived assuming some constraint qualification (CQ). The most frequently used CQ is strict feasibility, i.e., assuming the existence of an x with Ax ≺_K b. Here K is a closed convex cone and A : X → Y a linear operator, with X and Y being Euclidean spaces. We write z ⪯_K y and z ≺_K y to mean that y - z is in K, or in ri K...
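The generalized inequalities z ⪯_K y (y - z in K) and z ≺_K y (y - z in ri K) can be made concrete for K the nonnegative orthant, whose relative interior is the positive orthant; helper names below are illustrative, not from the paper. The instance also shows the FRA setting: a system that is feasible but never strictly feasible:

```python
# Generalized inequalities for K = R^n_+ (ri K = positive orthant).
def leq_K(z, y):
    """z <=_K y  iff  y - z is in K (componentwise nonnegative)."""
    return all(yi - zi >= 0 for zi, yi in zip(z, y))

def lt_K(z, y):
    """z <_K y  iff  y - z is in ri K (componentwise positive)."""
    return all(yi - zi > 0 for zi, yi in zip(z, y))

# Ax <=_K b with no strictly feasible point: the scalar system
# (x, -x) <=_K (0, 0) forces x <= 0 and x >= 0 simultaneously,
# so x = 0 is feasible but strict feasibility fails everywhere.
A = [1.0, -1.0]                   # maps scalar x to (x, -x)
b = [0.0, 0.0]
Ax = [a * 0.0 for a in A]         # the only feasible point is x = 0
print(leq_K(Ax, b), lt_K(Ax, b))
```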
Facial reduction algorithms for conic optimization problems
 Journal of Optimization Theory and Applications
Abstract

Cited by 9 (3 self)
To obtain a primaldual pair of conic programming problems having zero duality gap, two methods have been proposed: the facial reduction algorithm due to Borwein and Wolkowicz [1, 2] and the conic expansion method due to Luo, Sturm, and Zhang [5]. We establish a clear relationship between them. Our results show that although the two methods can be regarded as dual to each other, the facial reduction algorithm can produce a finer sequence of faces including the feasible region. We illustrate the facial reduction algorithm in LP, SOCP and an example of SDP. A simple proof of the convergence of the facial reduction algorithm for conic programming is also presented.
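One facial reduction step can be sketched for the LP cone (the LP illustration the abstract mentions; the instance and names are ours, not the paper's). For Ax = b, x in R^n_+, a certificate y with s = A'y >= 0 and b'y = 0 implies s'x = y'(Ax) = y'b = 0 for every feasible x, so x_j = 0 wherever s_j > 0: the feasible region lies in a proper face of the cone:

```python
# One facial-reduction step for  Ax = b, x in R^n_+.
# Certificate y with A'y >= 0 and b'y = 0  =>  feasible x vanish on
# every coordinate where the slack s = A'y is strictly positive.
A = [[1.0, 1.0, -1.0],
     [0.0, 0.0, 1.0]]
b = [0.0, 0.0]

y = [1.0, 1.0]                                   # candidate certificate
s = [sum(A[i][j] * y[i] for i in range(2)) for j in range(3)]
assert all(sj >= 0 for sj in s)                  # A'y in the dual cone
assert abs(sum(bi * yi for bi, yi in zip(b, y))) < 1e-12   # b'y = 0

# For every feasible x: s'x = y'(Ax) = y'b = 0, forcing these to zero.
forced_zero = [j for j, sj in enumerate(s) if sj > 0]
print(forced_zero)
```

Repeating such steps with fresh certificates on the reduced cone is the essence of the facial reduction sequence compared in the paper.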