Results 1-10 of 22
Implementation of interior-point methods for large-scale linear programs,
 In Interior Point Methods of Mathematical Programming,
, 1996
Primal-Dual Target-Following Algorithms for Linear Programming
 ANNALS OF OPERATIONS RESEARCH
, 1993
Abstract

Cited by 25 (1 self)
In this paper we propose a method for linear programming with the property that, starting from an initial noncentral point, it generates iterates that simultaneously get closer to optimality and closer to centrality. The iterates follow paths that in the limit are tangential to the central path. Along with the convergence analysis we provide a general framework which enables us to analyze various primal-dual algorithms in the literature in a short and uniform way.
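The target-following idea in this abstract lends itself to a compact sketch. Below is a minimal illustration in NumPy, not the authors' algorithm: one damped Newton step that drives the complementarity products x_i*s_i of a strictly feasible primal-dual pair for min{cx : Ax = b, x >= 0} toward a chosen target vector; choosing target = mu*e for shrinking mu recovers central-path following. The function name and the halving damping rule are our own assumptions.

```python
import numpy as np

def target_following_step(A, x, y, s, target):
    """One damped Newton step of a target-following scheme (illustrative
    sketch only): drive the products x_i * s_i toward `target`, assuming
    (x, y, s) is strictly positive and primal-dual feasible."""
    m, n = A.shape
    X, S = np.diag(x), np.diag(s)
    # Newton system for feasible iterates:
    #   A dx = 0,  A^T dy + ds = 0,  S dx + X ds = target - X S e
    K = np.block([
        [A,                np.zeros((m, m)), np.zeros((m, n))],
        [np.zeros((n, n)), A.T,              np.eye(n)],
        [S,                np.zeros((n, m)), X],
    ])
    rhs = np.concatenate([np.zeros(m), np.zeros(n), target - x * s])
    d = np.linalg.solve(K, rhs)
    dx, dy, ds = d[:n], d[n:n + m], d[n + m:]
    alpha = 1.0                      # damp until x and s stay strictly positive
    while np.any(x + alpha * dx <= 0) or np.any(s + alpha * ds <= 0):
        alpha *= 0.5
    return x + alpha * dx, y + alpha * dy, s + alpha * ds
```

On a toy feasible pair, one step with a target below the current products both shrinks the duality gap x.s and evens out the products, matching the "closer to optimality and closer to centrality" property the abstract claims.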
Combining Interior-Point and Pivoting Algorithms for Linear Programming
 Management Science
, 1996
A polynomial primal-dual Dikin-type algorithm for linear programming
 FACULTY OF TECHNICAL MATHEMATICS AND COMPUTER SCIENCE, DELFT UNIVERSITY OF TECHNOLOGY
, 1993
Abstract

Cited by 19 (9 self)
In this paper we present a new primal-dual affine scaling method for linear programming. The method yields a strictly complementary optimal solution pair, and also allows a polynomial-time convergence proof. The search direction is obtained by using the original idea of Dikin, namely by minimizing the objective function (which is the duality gap in the primal-dual case) over some suitable ellipsoid. This gives rise to completely new primal-dual affine scaling directions, having no obvious relation with the search directions proposed in the literature so far. The new directions guarantee a significant decrease in the duality gap in each iteration, and at the same time they drive the iterates to the central path. In the analysis of our algorithm we use a barrier function which is the natural primal-dual generalization of Karmarkar's potential function. The iteration bound is O(nL), which is a factor O(L) better than the iteration bound of an earlier primal-dual affine scaling meth...
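The primal-dual directions described here are the paper's contribution; for orientation, the following is only the classical primal affine-scaling iteration in Dikin's original spirit, which the abstract says is being generalized. This is our own sketch for a standard-form LP min{cx : Ax = b, x >= 0}, started from a strictly positive feasible x; the step fraction gamma and the stopping tolerance are assumptions.

```python
import numpy as np

def dikin_affine_scaling(A, c, x, iters=60, gamma=0.5):
    """Classical primal affine-scaling (Dikin's idea, sketched for
    illustration): repeatedly minimize c^T x over an ellipsoid centered
    at the current strictly positive feasible point x."""
    for _ in range(iters):
        X2 = np.diag(x ** 2)                           # scaling matrix X^2
        y = np.linalg.solve(A @ X2 @ A.T, A @ X2 @ c)  # dual estimate
        s = c - A.T @ y                                # reduced costs
        dx = -X2 @ s                                   # affine-scaling direction
        if np.linalg.norm(dx) < 1e-16:                 # essentially converged
            break
        neg = dx < 0
        # move a fraction gamma of the way to the positivity boundary
        alpha = gamma * np.min(-x[neg] / dx[neg]) if neg.any() else 1.0
        x = x + alpha * dx
    return x
```

Because the dual estimate y makes A dx = 0, every iterate stays feasible; only the positivity bound limits the step.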
A Computational View of Interior-Point Methods for Linear Programming
 IN: ADVANCES IN LINEAR AND INTEGER PROGRAMMING
, 1994
Abstract

Cited by 17 (10 self)
Many issues that are crucial for an efficient implementation of an interior point algorithm are addressed in this paper. To start with, a prototype primal-dual algorithm is presented. Next, many tricks that make it so efficient in practice are discussed in detail. These include: preprocessing techniques, initialization approaches, methods of computing search directions (and the linear algebra techniques behind them), centering strategies, and methods of step-size selection. Several causes of numerical difficulties, e.g. the primal degeneracy of optimal solutions or the lack of feasible solutions, are explained in a comprehensive way. A motivation for obtaining an optimal basis is given and a practicable algorithm to perform this task is presented. Advantages of different methods to perform postoptimal analysis (applicable to interior point optimal solutions) are discussed. Important questions that still remain open in the implementations of i...
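One of the implementation details listed, step-size selection, is small enough to show concretely. A rule widely used in primal-dual codes (a sketch of the generic idea, not this paper's specific strategy) takes a fixed fraction of the largest step that keeps the iterate strictly positive; the damping factor 0.9995 is a conventional choice, not taken from the paper.

```python
import numpy as np

def max_step(v, dv, eta=0.9995):
    """Fraction-to-the-boundary step rule: largest alpha in (0, 1] with
    v + alpha * dv > 0 componentwise, damped by eta so the iterate stays
    strictly interior. Applied separately to the primal x and dual slack s."""
    neg = dv < 0
    if not neg.any():
        return 1.0
    return min(1.0, eta * float(np.min(-v[neg] / dv[neg])))
```

In a full solver this is called once with (x, dx) and once with (s, ds), and the smaller of the two (or separate primal and dual steps) is used.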
The Optimal Set and Optimal Partition Approach to Linear and Quadratic Programming
 in Advances in Sensitivity Analysis and Parametric Programming
, 1996
Abstract

Cited by 16 (3 self)
In this chapter we describe the optimal set approach for sensitivity analysis in LP. We show that optimal partitions and optimal sets remain constant between two consecutive transition points of the optimal value function. The advantage of using this approach instead of the classical approach (using optimal bases) is shown. Moreover, we present an algorithm to compute the partitions, optimal sets and the optimal value function. This algorithm is new and uses primal and dual optimal solutions. We also extend some of the results to parametric quadratic programming, and discuss differences and resemblances with the linear programming case.
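The optimal partition this chapter works with is easy to extract from a strictly complementary optimal pair, which interior-point methods provide. A minimal sketch (our own helper, with an assumed tolerance):

```python
import numpy as np

def optimal_partition(x, s, tol=1e-8):
    """Optimal partition (B, N) from a strictly complementary primal-dual
    optimal pair (x, s): B holds the indices with x_i > 0, N those with
    s_i > 0. Strict complementarity puts each index in exactly one set."""
    B = np.flatnonzero(x > tol)
    N = np.flatnonzero(s > tol)
    if len(B) + len(N) != len(x):
        raise ValueError("pair is not strictly complementary at this tolerance")
    return B, N
```

Between two consecutive transition points of the optimal value function this partition stays constant, which is what makes it a sound basis for sensitivity ranges.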
Simultaneous primal-dual right-hand-side sensitivity analysis from a strictly complementary solution of a linear program
 SIAM J. Optim
, 2006
Abstract

Cited by 15 (0 self)
This paper establishes theorems about the simultaneous variation of right-hand sides and cost coefficients in a linear program from a strictly complementary solution. Some results are extensions of those that have been proven for varying the right-hand side of the primal or the dual, but not both; other results are new. In addition, changes in the optimal partition and what that means in economic terms are related to the basis-driven approach, notably to the theory of compatibility. In addition to new theorems about this relation, the transition graph is extended to provide another visualization of the underlying economics.
Sensitivity Analysis in (Degenerate) Quadratic Programming
 DELFT UNIVERSITY OF TECHNOLOGY
, 1996
Abstract

Cited by 10 (2 self)
In this paper we deal with sensitivity analysis in convex quadratic programming, without making assumptions on nondegeneracy, strict convexity of the objective function, or the existence of a strictly complementary solution. We show that the optimal value as a function of a right-hand side element (or an element of the linear part of the objective) is piecewise quadratic, where the pieces can be characterized by maximal complementary solutions and tripartitions. Further, we investigate differentiability of this function. A new algorithm to compute the optimal value function is proposed. Finally, we discuss the advantages of this approach when applied to mean-variance portfolio models.
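The piecewise-quadratic behaviour is easy to see on a toy instance. Consider the one-variable parametric QP min{x^2 : x >= t} (our own illustrative example, not from the paper): its optimal value is 0 for t <= 0 and t^2 for t >= 0, i.e. two quadratic pieces joined at the transition point t = 0, where the active set (and the tripartition) changes.

```python
def qp_optimal_value(t):
    """Optimal value of the toy parametric QP  min x^2  s.t.  x >= t.
    For t <= 0 the constraint is inactive (optimum x = 0); for t >= 0
    it is active (optimum x = t): one quadratic piece per regime."""
    x_opt = max(t, 0.0)
    return x_opt ** 2
```

The function is differentiable everywhere here, but in general the pieces can meet with a kink, which is why the paper investigates differentiability separately.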
Matrix Sensitivity Analysis from an Interior Solution of a Linear Program
 INFORMS J. Comput
, 1997
Abstract

Cited by 9 (0 self)
This paper considers the effect of changing matrix coefficients in a linear program after we have obtained an interior solution. Changes are restricted to those for which there remains an optimal solution to the perturbed problem (called "admissible"). Mills' minimax theorem provides one approach and has been used for similar sensitivity analysis from a basic optimum. Here we consider the effect on the optimal partition and how the analysis results relate to the classical approach that uses a basic solution. Keywords: linear programming, sensitivity analysis, optimal partition, interior solution, computational economics. Introduction: Consider the primal-dual linear programs min{cx : x >= 0, Ax >= b} and max{yb : y >= 0, yA <= c}, where c is a row vector in R^n, called objective coefficients; x is a column vector in R^n, called l...
Rim Sensitivity Analysis From An Interior Solution
, 1996
Abstract

Cited by 7 (2 self)
This establishes theorems about the simultaneous variation of right-hand sides and cost coefficients in a linear program from an interior solution. Some results are extensions of those that have been proven for varying the right-hand side of the primal or the dual, but not both; other results are new. In addition, changes in the optimal partition and what that means in economic terms are related to the basis-driven approach, notably to the Theory of Compatibility. In addition to new theorems about this relation, the transition graph is extended to provide another visualization of the underlying economics. Key words: linear programming, sensitivity analysis, computational economics, interior point methods, parametric programming, optimal partition. Introduction: Consider the primal-dual pair of linear programs P : min{cx : x >= 0, Ax >= b} and D : max{pib : pi >= 0, piA <= c}, where c is a row vector in R^n, called objective coefficients; x is a column vector in R^n, called levels; b is...