Results 1–10 of 35
INTERIOR PATH FOLLOWING PRIMAL-DUAL ALGORITHMS. PART I: LINEAR PROGRAMMING
, 1989
Abstract

Cited by 198 (11 self)
We describe a primal-dual interior point algorithm for linear programming problems which requires a total of O(√n L) iterations, where L is the input size. Each iteration updates a penalty parameter and finds the Newton direction associated with the Karush-Kuhn-Tucker system of equations which characterizes a solution of the logarithmic barrier function problem. The algorithm is based on the path following idea.
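The iteration described here can be sketched in a few lines. The following is a minimal toy implementation, not the paper's algorithm: the function name, starting point, and centering parameter σ are my own choices, and the Newton system is reduced to the usual normal equations.

```python
import numpy as np

def primal_dual_lp(A, b, c, sigma=0.1, tol=1e-8, max_iter=200):
    """Toy primal-dual path-following sketch: solve min c^T x s.t. Ax = b,
    x >= 0 together with its dual. Each iteration updates the penalty
    (centering) parameter mu and takes a damped Newton step on the perturbed
    Karush-Kuhn-Tucker system
        A x = b,   A^T y + s = c,   x_i s_i = mu  (i = 1..n),
    which characterizes the minimizer of the logarithmic barrier problem.
    """
    m, n = A.shape
    x, s, y = np.ones(n), np.ones(n), np.zeros(m)   # strictly interior start
    for _ in range(max_iter):
        r_p = b - A @ x                  # primal infeasibility
        r_d = c - A.T @ y - s            # dual infeasibility
        gap = x @ s                      # duality measure
        if gap < tol and np.linalg.norm(r_p) < tol and np.linalg.norm(r_d) < tol:
            break
        mu = sigma * gap / n             # updated penalty parameter
        r_c = mu - x * s                 # centering residual
        # Eliminate (dx, ds): normal equations  A D A^T dy = rhs,  D = diag(x/s)
        D = x / s
        M = (A * D) @ A.T
        rhs = r_p - A @ ((r_c - x * r_d) / s)
        dy = np.linalg.solve(M, rhs)
        ds = r_d - A.T @ dy
        dx = (r_c - x * ds) / s
        alpha = 1.0                      # ratio test: keep (x, s) > 0
        for v, dv in ((x, dx), (s, ds)):
            neg = dv < 0
            if neg.any():
                alpha = min(alpha, 0.99 * np.min(-v[neg] / dv[neg]))
        x, y, s = x + alpha * dx, y + alpha * dy, s + alpha * ds
    return x
```

For example, min x₁ + 2x₂ subject to x₁ + x₂ = 1, x ≥ 0 is solved at x = (1, 0).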
HOMOTOPY CONTINUATION METHODS FOR NONLINEAR COMPLEMENTARITY PROBLEMS
, 1991
Abstract

Cited by 43 (3 self)
A complementarity problem with a continuous mapping f from R^n into itself can be written as the system of equations F(x, y) = 0 and (x, y) ≥ 0. Here F is the mapping from R^{2n} into itself defined by F(x, y) = (x_1 y_1, x_2 y_2, …, x_n y_n, y − f(x)). Under the assumption that the mapping f is a P_0-function, we study various aspects of homotopy continuation methods that trace a trajectory consisting of solutions of the family of systems of equations F(x, y) = t(a, b) and (x, y) > 0 until the parameter t > 0 attains 0. Here (a, b) denotes a 2n-dimensional constant positive vector. We establish the existence of a trajectory which leads to a solution of the problem, and then present a numerical method for tracing the trajectory. We also discuss the global and local convergence of the method.
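The trajectory-tracing idea can be illustrated with a short sketch. This is my own simplification, not the paper's method: a = b = (1, …, 1), a fixed geometric reduction of t, and plain damped Newton corrections at each t.

```python
import numpy as np

def trace_ncp(f, jac_f, n, t0=1.0, shrink=0.5, t_min=1e-10):
    """Sketch of a homotopy continuation method: follow the trajectory of
    solutions of  F(x, y) = t * (a, b),  (x, y) > 0,  i.e.
        x_i y_i = t a_i   and   y - f(x) = t b,
    as t decreases toward 0. At each t the point is corrected by damped
    Newton steps on F(x, y) - t(a, b) = 0.
    """
    a, b = np.ones(n), np.ones(n)        # the constant positive vector (a, b)
    x, y = np.ones(n), np.ones(n)        # starting point, corrected below
    t = t0
    while t > t_min:
        for _ in range(50):              # Newton correction at the current t
            F = np.concatenate([x * y - t * a, y - f(x) - t * b])
            if np.linalg.norm(F) < 1e-12:
                break
            J = np.block([[np.diag(y), np.diag(x)],
                          [-jac_f(x), np.eye(n)]])
            d = np.linalg.solve(J, -F)
            dx, dy = d[:n], d[n:]
            alpha = 1.0                  # damp to keep (x, y) strictly positive
            for v, dv in ((x, dx), (y, dy)):
                neg = dv < 0
                if neg.any():
                    alpha = min(alpha, 0.9 * np.min(-v[neg] / dv[neg]))
            x, y = x + alpha * dx, y + alpha * dy
        t *= shrink
    return x, y
```

For instance, f(x) = x − 1 has the complementarity solution x = 1, y = 0, and the traced trajectory approaches it as t → 0.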
Advances in convex optimization: Conic programming
 In Proceedings of International Congress of Mathematicians
, 2007
Abstract

Cited by 22 (0 self)
Abstract. During the last two decades, major developments in convex optimization have focused on conic programming, primarily on linear, conic quadratic, and semidefinite optimization. Conic programming allows one to reveal the rich structure usually possessed by a convex program and to exploit this structure in order to process the program efficiently. In the paper, we overview the major components of the resulting theory (conic duality and primal-dual interior point polynomial time algorithms), outline the extremely rich “expressive abilities” of conic quadratic and semidefinite programming, and discuss a number of instructive applications.
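The conic duality mentioned here takes a compact form. In standard notation (mine, not quoted from the paper), a conic program and its dual over a closed convex cone K read:

```latex
% Primal-dual pair of conic programs over a closed convex cone K,
% with dual cone K^* = { s : s^T x >= 0 for all x in K }.
\begin{align*}
\text{(P)}\quad & \min_{x}\ c^{\top}x \quad \text{s.t.}\ Ax = b,\ x \in K,\\
\text{(D)}\quad & \max_{y}\ b^{\top}y \quad \text{s.t.}\ c - A^{\top}y \in K^{*}.
\end{align*}
```

Taking K to be the nonnegative orthant recovers linear programming; the second-order (Lorentz) cone gives conic quadratic programming, and the cone of positive semidefinite matrices gives semidefinite programming.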
Interior-point methods for optimization
, 2008
Abstract

Cited by 17 (0 self)
This article describes the current state of the art of interiorpoint methods (IPMs) for convex, conic, and general nonlinear optimization. We discuss the theory, outline the algorithms, and comment on the applicability of this class of methods, which have revolutionized the field over the last twenty years.
Projective Transformations for Interior Point Algorithms, and a Superlinearly Convergent Algorithm for the W-Center Problem
Abstract

Cited by 13 (2 self)
The purpose of this study is to broaden the scope of projective transformation methods in mathematical programming, both in terms of theory and algorithms. We start by generalizing the concept of the analytic center of a polyhedral system of constraints to the w-center of a polyhedral system, which stands for weighted center, where there are positive weights on the logarithmic barrier terms for each inequality constraint defining the polyhedron X. We prove basic results regarding contained and containing ellipsoids centered at the w-center of the system X. We next shift attention to projective transformations, and we exhibit an elementary projective transformation that transforms the polyhedron X to another polyhedron Z, and that transforms the current interior point to the w-center of the transformed polyhedron Z. We work throughout with a polyhedral system of the most general form, namely both inequality and equality constraints. This theory is then applied to the problem of finding the w-center of a polyhedral system X. We present a projective transformation algorithm, which is ...
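The w-center concept itself is easy to make concrete. The sketch below is my own illustration (not the paper's projective algorithm): for a pure inequality system {x : Ax ≤ b}, with the equality constraints of the general form omitted, it maximizes Σᵢ wᵢ log(bᵢ − aᵢᵀx) by damped Newton ascent.

```python
import numpy as np

def w_center(A, b, w, x0, tol=1e-12, max_iter=100):
    """Sketch: compute the w-center of X = {x : A x <= b}, i.e. the maximizer
    of f(x) = sum_i w_i * log(b_i - a_i^T x), by damped Newton ascent.
    x0 must be strictly interior; equality constraints are omitted here.
    """
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        s = b - A @ x                          # slacks, kept strictly positive
        g = -A.T @ (w / s)                     # gradient of f at x
        H = A.T @ ((w / s**2)[:, None] * A)    # = -Hessian of f (pos. definite)
        dx = np.linalg.solve(H, g)             # Newton ascent direction
        if g @ dx < tol:                       # squared Newton decrement
            break
        Adx = A @ dx
        alpha, pos = 1.0, Adx > 0
        if pos.any():                          # ratio test: keep slacks > 0
            alpha = min(1.0, 0.99 * np.min(s[pos] / Adx[pos]))
        x = x + alpha * dx
    return x
```

On the interval [−1, 1] (constraints x ≤ 1 and −x ≤ 1), equal weights give the analytic center 0, while weights (3, 1) pull the w-center to −1/2.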
Degeneracy in Interior Point Methods for Linear Programming
, 1991
Abstract

Cited by 12 (1 self)
... In this paper, we survey the various theoretical and practical issues related to degeneracy in IPMs for linear programming, collecting results that for the most part have already appeared in the literature. Roughly speaking, we deal with four topics: the effect of degeneracy on the convergence of IPMs, on the trajectories followed by the algorithms, on numerical performance, and on finding basic solutions.
An Interior Point Potential Reduction Method for Constrained Equations
, 1995
Abstract

Cited by 11 (3 self)
We study the problem of solving a constrained system of nonlinear equations by a combination of the classical damped Newton method for (unconstrained) smooth equations and the recent interior point potential reduction methods for linear programs and for linear and nonlinear complementarity problems. In general, constrained equations provide a unified formulation for many mathematical programming problems, including complementarity problems of various kinds and the Karush-Kuhn-Tucker systems of variational inequalities and nonlinear programs. Combining ideas from the damped Newton and interior point methods, we present an iterative algorithm for solving a constrained system of equations and investigate its convergence properties. Specialization of the algorithm and its convergence analysis to complementarity problems of various kinds and the Karush-Kuhn-Tucker systems of variational inequalities are discussed in detail. We also report the computational results of the implementation of the algo...
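A minimal sketch of this combination of ideas, under simplifications of my own: the constraint set is taken to be the positive orthant, and plain residual decrease stands in for the paper's potential reduction.

```python
import numpy as np

def damped_newton_pos(F, J, z0, tol=1e-10, max_iter=100):
    """Sketch of damped Newton for F(z) = 0 with z constrained to the positive
    orthant: the Newton step is first shortened by a ratio test so the iterate
    stays strictly positive (interior-point safeguard), then halved until the
    residual norm ||F(z)|| decreases (classical damping).
    """
    z = z0.astype(float)
    for _ in range(max_iter):
        Fz = F(z)
        if np.linalg.norm(Fz) < tol:
            break
        d = np.linalg.solve(J(z), -Fz)   # Newton direction
        alpha = 1.0
        neg = d < 0
        if neg.any():                    # stay strictly inside z > 0
            alpha = min(1.0, 0.9 * np.min(-z[neg] / d[neg]))
        while np.linalg.norm(F(z + alpha * d)) >= np.linalg.norm(Fz) and alpha > 1e-12:
            alpha *= 0.5                 # damp until the residual decreases
        z = z + alpha * d
    return z
```

For example, the system z₁² + z₂² = 2, z₁ = z₂ with z > 0 has the unique solution (1, 1).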
Smoothed Analysis of Condition Numbers and Complexity Implications for Linear Programming
, 2009
Abstract

Cited by 11 (0 self)
We perform a smoothed analysis of Renegar’s condition number for linear programming by analyzing the distribution of the distance to ill-posedness of a linear program subject to a slight Gaussian perturbation. In particular, we show that for every n-by-d matrix Ā, n-vector b̄, and d-vector c̄ satisfying ‖(Ā, b̄, c̄)‖_F ≤ 1 and every σ ≤ 1, E_{A,b,c}[log C(A, b, c)] = O(log(nd/σ)), where A, b, and c are Gaussian perturbations of Ā, b̄, and c̄ of variance σ² and C(A, b, c) is the condition number of the linear program defined by (A, b, c). From this bound, we obtain a smoothed analysis of interior point algorithms. By combining this with the smoothed analysis of finite termination of Spielman and Teng (Math. Prog. Ser. B, 2003), we show that the smoothed complexity of interior point algorithms for linear programming is O(n³ log(nd/σ)).
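Renegar's condition number C(A, b, c) is nontrivial to evaluate, but the flavor of a smoothed analysis can be illustrated on the classical matrix condition number κ (a stand-in of my own, not the paper's quantity): even when the center matrix Ā is singular, so κ(Ā) = ∞, the average of log κ over Gaussian perturbations of size σ stays modest.

```python
import numpy as np

def smoothed_log_cond(A_bar, sigma, trials=200, rng=None):
    """Monte Carlo estimate of E[log kappa(A)], where A = A_bar + sigma * G
    and G has i.i.d. standard Gaussian entries. Smoothed analysis studies how
    such averages grow as the perturbation size sigma shrinks.
    """
    rng = np.random.default_rng(rng)
    n = A_bar.shape[0]
    logs = []
    for _ in range(trials):
        A = A_bar + sigma * rng.standard_normal((n, n))
        logs.append(np.log(np.linalg.cond(A)))
    return float(np.mean(logs))
```

For instance, the rank-one matrix of all ones is singular, yet `smoothed_log_cond(np.ones((4, 4)), 0.1)` returns a small finite value, in line with bounds of the form O(log(n/σ)).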