Results 1–10 of 12
Sensitivity analysis in linear programming and semidefinite programming using interior-point methods
Cornell University, 1999
Abstract

Cited by 21 (2 self)
We analyze perturbations of the right-hand side and the cost parameters in linear programming (LP) and semidefinite programming (SDP). We obtain tight bounds on the norm of the perturbations that allow interior-point methods to recover feasible and near-optimal solutions in a single interior-point iteration. For the unique, nondegenerate solution case in LP, we show that the bounds obtained using interior-point methods compare nicely with the bounds arising from the simplex method. We also present explicit bounds for SDP using the AHO, HKM, and NT directions.
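To make the right-hand-side sensitivity concrete, here is a small illustrative sketch (mine, not from the paper): a tiny two-variable LP whose optimal value changes at the rate of the active constraint's dual price when the right-hand side is perturbed. The vertex-enumeration "solver" below is a toy assumed only for this illustration, not a general LP method.

```python
from itertools import combinations

def solve_lp_2d(c, A, b):
    """Maximize c.x over {x : A x <= b} by enumerating constraint-pair
    vertices (fine for tiny 2-D examples; not a general LP solver)."""
    best_x, best_val = None, float("-inf")
    for i, j in combinations(range(len(A)), 2):
        (a11, a12), (a21, a22) = A[i], A[j]
        det = a11 * a22 - a12 * a21
        if abs(det) < 1e-12:
            continue  # parallel constraints: no vertex
        x = (b[i] * a22 - a12 * b[j]) / det   # Cramer's rule
        y = (a11 * b[j] - b[i] * a21) / det
        if all(A[k][0] * x + A[k][1] * y <= b[k] + 1e-9 for k in range(len(A))):
            val = c[0] * x + c[1] * y
            if val > best_val:
                best_x, best_val = (x, y), val
    return best_x, best_val

# max 3x + 2y  s.t.  x <= 4, y <= 3, x + y <= 5, x >= 0, y >= 0
c = (3.0, 2.0)
A = [(1, 0), (0, 1), (1, 1), (-1, 0), (0, -1)]
b = [4.0, 3.0, 5.0, 0.0, 0.0]

_, v0 = solve_lp_2d(c, A, b)        # optimal value 14 at the vertex (4, 1)
delta = 0.1                          # small right-hand-side perturbation
b_pert = b[:]
b_pert[2] += delta                   # b3: 5 -> 5.1
_, v1 = solve_lp_2d(c, A, b_pert)
# For a perturbation this small the optimal basis does not change, so the
# value moves at the rate of the dual price (2) of the perturbed constraint.
print(v0, v1)
```

The point of the sketch: within the perturbation range the papers above characterize, the same basis (here, the same active vertex) remains optimal and the value function is linear in the perturbation.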
A sensitivity analysis and a convergence result for a sequential semidefinite programming method, Numerical Analysis Manuscript No.
, 2003
Error Bounds for Linear Matrix Inequalities
, 1998
Abstract

Cited by 7 (0 self)
For iterative sequences that converge to the solution set of a linear matrix inequality, we show that the distance of the iterates to the solution set is at most O(ε^{2^{-d}}). The nonnegative integer d is the so-called degree of singularity of the linear matrix inequality, and ε denotes the amount of constraint violation in the iterate. For infeasible linear matrix inequalities, we show that the minimal norm of ε-approximate primal solutions is at least 1/O(ε^{1/(2^d - 1)}), and the minimal norm of ε-approximate Farkas-type dual solutions is at most O(1/ε^{2^d - 1}). As an application of these error bounds, we show that for any bounded sequence of ε-approximate solutions to a semidefinite programming problem, the distance to the optimal solution set is at most O(ε^{2^{-k}}), where k is the degree of singularity of the optimal solution set.
Keywords: semidefinite programming, error bounds, linear matrix inequality, regularized duality. AMS s...
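A minimal numeric sketch (mine, not from the paper) of how an exponent like 2^{-d} can arise: for the LMI X ⪰ 0 with the single equality X11 = 0, every solution must also have X12 = 0, yet a point violating the equality by ε can keep an off-diagonal entry as large as √ε. This toy system is commonly cited as having degree of singularity d = 1, so the distance to the solution set behaves like ε^{1/2} = ε^{2^{-1}} rather than O(ε).

```python
import math

def max_offdiag(eps):
    """Largest b such that [[eps, b], [b, 1]] is positive semidefinite:
    the 2x2 PSD condition requires det = eps * 1 - b**2 >= 0."""
    return math.sqrt(eps)

# The solution set of {X PSD, X11 = 0} is {[[0, 0], [0, c]] : c >= 0}, so the
# distance from X_eps = [[eps, b], [b, 1]] to it is roughly |b| <= sqrt(eps).
for eps in (1e-2, 1e-4, 1e-6):
    b = max_offdiag(eps)
    print(eps, b)  # violation eps, but distance only shrinks like eps**(2**-1)
```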
Global Error Bounds for Convex Conic Problems
, 1998
Abstract

Cited by 4 (1 self)
In this paper, Lipschitzian-type error bounds are derived for general convex conic problems under various regularity conditions. Specifically, it is shown that if the recession directions satisfy Slater's condition, then a global Lipschitzian-type error bound holds. Alternatively, if the feasible region is bounded, then the ordinary Slater condition guarantees a global Lipschitzian-type error bound. These can be considered as generalizations of previously known results for inequality systems. Moreover, some of the results are also generalized to the intersection of multiple cones. Under Slater's condition alone, a global Lipschitzian-type error bound may not hold. However, it is shown that such an error bound holds for a specific region. For linear systems we show that the constant involved in Hoffman's error bound can be estimated by the so-called condition number for linear programming.
Key words: error bound, convex conic problems, LMIs, condition number. AMS subject classification: 5...
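As a hedged illustration (mine, not the paper's) of a Hoffman-type error bound dist(p, F) ≤ H · ||(Ap − b)_+|| for a linear system F = {x : Ax ≤ b}: for an axis-aligned box the constraint rows are orthonormal, and the bound holds with Hoffman constant H = 1, so the distance to the feasible set never exceeds the Euclidean norm of the constraint violation.

```python
import math
import random

def residual(p, A, b):
    """Euclidean norm of the constraint violation (A p - b)_+ for A p <= b."""
    r = [max(0.0, sum(a_ij * p_j for a_ij, p_j in zip(row, p)) - b_i)
         for row, b_i in zip(A, b)]
    return math.sqrt(sum(v * v for v in r))

def dist_to_box(p, hi):
    """Exact distance from p to the box {x : x_i <= hi_i} (clip each coordinate)."""
    return math.sqrt(sum(max(0.0, p_i - h) ** 2 for p_i, h in zip(p, hi)))

# For the system x <= hi the rows of A are orthonormal, so Hoffman's bound
# holds with constant 1: dist(p, F) <= 1 * ||(A p - b)_+|| for every p.
A = [(1, 0), (0, 1)]
hi = (1.0, 2.0)
random.seed(0)
for _ in range(100):
    p = (random.uniform(-3, 3), random.uniform(-3, 3))
    assert dist_to_box(p, hi) <= residual(p, A, list(hi)) + 1e-12
```

For general (non-orthogonal) constraint rows the constant H can be much larger than 1, which is exactly what the condition-number estimate in the abstract addresses.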
A unifying optimal partition approach to sensitivity analysis in conic optimization
 Journal of Optimization Theory and Applications
, 2004
Abstract

Cited by 2 (0 self)
We study convex conic optimization problems in which the right-hand side and the cost vectors vary linearly as a function of a scalar parameter. We present a unifying geometric framework that subsumes the concept of the optimal partition in linear programming (LP) and semidefinite programming (SDP) and extends it to conic optimization. Similar to the optimal partition approach to sensitivity analysis in LP and SDP, the range of perturbations for which the optimal partition remains constant can be computed by solving two conic optimization problems. Under a weaker notion of nondegeneracy, this range is simply given by a minimum ratio test. We briefly discuss the properties of the optimal value function under such perturbations.
* Revised version of the former technical report "On Sensitivity Analysis in Conic Programming" dated October 22, 2001.
† Department of Applied Mathematics and Statistics, Stony Brook University, Stony Brook, NY 11794-3600, USA. The author is supported in part by NSF through CAREER grant DMI-0237415. (yildirim@ams.sunysb.edu)
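In the LP case, the "minimum ratio test" mentioned above is the classical computation of how far the right-hand side can move along a direction before a basic variable leaves the nonnegative orthant. A small illustrative sketch (the data below are made up, not from the paper): given the current basic values x_B and the direction d = B⁻¹Δb induced by the parametric right-hand side b + tΔb, the basis stays primal feasible exactly for t in the interval returned below.

```python
def ratio_test_range(xB, d):
    """Range of t for which xB + t*d stays componentwise nonnegative.
    xB: current basic values (all >= 0); d: direction B^{-1} * delta_b."""
    t_lo, t_hi = float("-inf"), float("inf")
    for x_i, d_i in zip(xB, d):
        if d_i > 0:
            t_lo = max(t_lo, -x_i / d_i)   # binding as t decreases
        elif d_i < 0:
            t_hi = min(t_hi, -x_i / d_i)   # binding as t increases
    return t_lo, t_hi

# Hypothetical basic values (2, 1, 4) moving along direction (1, -0.5, 0):
lo, hi = ratio_test_range([2.0, 1.0, 4.0], [1.0, -0.5, 0.0])
print(lo, hi)   # the basis stays feasible for t in [-2.0, 2.0]
```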
Report 12T-010: On Identification of the Optimal Partition of Second Order Cone Optimization Problems
Abstract
Abstract. This paper discusses the identification of the optimal partition of second order cone optimization (SOCO). By giving definitions of two condition numbers, which depend only on the SOCO problem itself, we derive bounds on the magnitude of the blocks of variables along the central path, and prove that the optimal partition B, N, R, and T for SOCO problems can be identified along the central path when the barrier parameter µ is small enough. We then generalize the results to a specific neighborhood of the central path.
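A tiny illustration (mine, not from the report) of identifying a partition along a central path: for the LP min x1 subject to x1 + x2 = 1, x ≥ 0, the barrier optimality condition 1 − µ/x1 + µ/x2 = 0 reduces to a quadratic in x1. As µ shrinks, x1 decreases proportionally to µ (the "small" block, N) while x2 stays near 1 (block B), which is the kind of behavior condition-number bounds make quantitative; in SOCO the blocks B, N, R, and T play the analogous role for cone coordinates.

```python
import math

def central_path_point(mu):
    """Central-path point of  min x1  s.t.  x1 + x2 = 1, x >= 0.
    Substituting x2 = 1 - x1 into the barrier optimality condition
    1 - mu/x1 + mu/x2 = 0 gives  x1**2 - (1 + 2*mu)*x1 + mu = 0;
    the smaller root is the central-path value of x1."""
    b = 1.0 + 2.0 * mu
    x1 = (b - math.sqrt(b * b - 4.0 * mu)) / 2.0
    return x1, 1.0 - x1

for mu in (1e-1, 1e-3, 1e-6):
    x1, x2 = central_path_point(mu)
    print(mu, x1 / mu, x2)   # x1/mu -> 1 (block N shrinks like mu); x2 -> 1 (block B)
```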