Results 1–10 of 22
Interior-point Methods
, 2000
Cited by 607 (15 self)
The modern era of interior-point methods dates to 1984, when Karmarkar proposed his algorithm for linear programming. In the years since then, algorithms and software for linear programming have become quite sophisticated, while extensions to more general classes of problems, such as convex quadratic programming, semidefinite programming, and nonconvex and nonlinear problems, have reached varying levels of maturity. We review some of the key developments in the area, including comments on both the complexity theory and practical algorithms for linear programming, semidefinite programming, monotone linear complementarity, and convex programming over sets that can be characterized by self-concordant barrier functions.
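The central-path idea behind these barrier methods can be illustrated with a deliberately tiny example (a sketch of ours, not any algorithm from the survey): minimize c·x over the box 0 ≤ x ≤ 1 by applying Newton's method to a logarithmic barrier and shrinking the barrier weight mu.

```python
def barrier_min(c, mu, iters=100):
    """Newton's method on the 1-D barrier problem
        minimize  c*x - mu*log(x) - mu*log(1 - x)   over 0 < x < 1.
    As mu -> 0, the minimizer x(mu) traces the central path toward
    the solution of the LP  min c*x  s.t.  0 <= x <= 1."""
    x = 0.5                                    # strictly feasible start
    for _ in range(iters):
        g = c - mu / x + mu / (1.0 - x)        # gradient of the barrier objective
        h = mu / x**2 + mu / (1.0 - x)**2      # Hessian (always positive)
        x -= g / h
        x = min(max(x, 1e-12), 1.0 - 1e-12)    # stay strictly inside (0, 1)
    return x

# For c > 0 the LP solution is x* = 0; the barrier minimizers approach it:
for mu in (1.0, 0.1, 0.01):
    print(mu, barrier_min(1.0, mu))
```

Real interior-point codes take Newton steps in many variables and decrease mu adaptively, but the picture — a smooth minimizer sliding along a path toward the constrained optimum — is the same.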
Fast Optimization Methods for L1 Regularization: A Comparative Study and Two New Approaches
Cited by 84 (2 self)
L1 regularization is effective for feature selection, but the resulting optimization is challenging due to the non-differentiability of the 1-norm. In this paper we compare state-of-the-art optimization techniques to solve this problem across several loss functions. Furthermore, we propose two new techniques. The first is based on a smooth (differentiable) convex approximation for the L1 regularizer that does not depend on any assumptions about the loss function used. The other technique is a new strategy that addresses the non-differentiability of the L1 regularizer by casting the problem as a constrained optimization problem that is then solved using a specialized gradient projection method. Extensive comparisons show that our newly proposed approaches consistently rank among the best in terms of convergence speed and efficiency, as measured by the number of function evaluations required.
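The constrained reformulation can be sketched in one dimension (a toy of ours, not the paper's solver): splitting x = p − q with p, q ≥ 0 turns the non-differentiable |x| into the linear term p + q, and projecting onto the constraints is just clamping at zero after each gradient step.

```python
def lasso_scalar(a, lam, lr=0.25, iters=500):
    """Solve  min_x 0.5*(x - a)**2 + lam*|x|  by writing x = p - q with
    p, q >= 0 (so |x| becomes p + q) and running projected gradient:
    take a gradient step, then clamp p and q at zero."""
    p = q = 0.0
    for _ in range(iters):
        r = p - q - a                        # residual of the smooth term
        p = max(0.0, p - lr * (r + lam))     # step + projection onto p >= 0
        q = max(0.0, q - lr * (-r + lam))    # step + projection onto q >= 0
    return p - q

# The closed form is soft-thresholding: sign(a) * max(|a| - lam, 0).
print(lasso_scalar(2.0, 0.5))   # approx  1.5
print(lasso_scalar(0.3, 0.5))   # approx  0.0  (coefficient killed by the penalty)
```

The same splitting works coordinate-wise for vector problems, which is why the resulting bound-constrained program is amenable to the gradient projection methods the paper studies.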
A simple polynomial-time rescaling algorithm for solving linear programs
In Proceedings of the 36th Annual ACM Symposium on Theory of Computing (STOC), 2004
Cited by 48 (5 self)
We show that the perceptron algorithm, along with periodic rescaling, solves linear programs in polynomial time. The algorithm requires no matrix inversions and no barrier functions.
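The perceptron core of the method is simple to state. The sketch below shows only that core — the classic update for the homogeneous strict-feasibility problem a·x > 0 for all rows a — without the paper's rescaling step, which is what yields the polynomial bound:

```python
def perceptron_feasible(A, iters=10000):
    """Seek x with a . x > 0 for every row a of A (homogeneous strict
    feasibility) via the classic perceptron update: whenever some row is
    violated, step toward its unit normal. No matrix inversions needed."""
    n = len(A[0])
    x = [0.0] * n
    for _ in range(iters):
        violated = next((a for a in A
                         if sum(ai * xi for ai, xi in zip(a, x)) <= 0), None)
        if violated is None:
            return x                      # strictly feasible point found
        norm = sum(ai * ai for ai in violated) ** 0.5
        x = [xi + ai / norm for xi, ai in zip(x, violated)]
    return None                           # margin too small for this budget

# A system with an obvious interior point, e.g. x = (1, 1):
A = [[1.0, 0.2], [0.3, 1.0], [1.0, -0.4]]
print(perceptron_feasible(A))
```

The iteration count of the plain update degrades as 1/margin²; the paper's contribution is the periodic rescaling that keeps the effective margin large.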
Implementation of interior point methods for mixed semidefinite and second order cone optimization problems
 Optimization Methods and Software
Cited by 41 (0 self)
There are a large number of implementation choices to be made for the primal-dual interior point method in the context of mixed semidefinite and second-order cone optimization. This paper presents these implementation issues in a unified framework and compares the choices made by different research groups. This is also the first paper to provide an elaborate discussion of the implementation in SeDuMi.
A global and local superlinear continuation-smoothing method for P0 + R0 and monotone NCP
 SIAM Journal on Optimization
, 1997
A global linear and local quadratic continuation smoothing method for variational inequalities with box constraints.
, 1997
Finding a point in the relative interior of a polyhedron
, 2007
Cited by 9 (3 self)
A new initialization or 'Phase I' strategy for feasible interior point methods for linear programming is proposed that computes a point on the primal-dual central path associated with the linear program. Provided there exist primal-dual strictly feasible points (an all-pervasive assumption in interior point method theory that implies the existence of the central path), our initial method (Algorithm 1) is globally Q-linearly and asymptotically Q-quadratically convergent, with a provable worst-case iteration complexity bound. When this assumption is not met, the numerical behaviour of Algorithm 1 is highly disappointing, even when the problem is primal-dual feasible. This is due to the presence of implicit equalities: inequality constraints that hold as equalities at all feasible points. Controlled perturbations of the inequality constraints of the primal-dual problems are introduced (geometrically equivalent to enlarging the primal-dual feasible region and then systematically contracting it back to its initial shape) so that the perturbed problems satisfy the assumption. Thus Algorithm 1 can successfully be employed to solve each of the perturbed problems. We show that, when there exist primal-dual strictly feasible points of the original problems, the resulting method, Algorithm 2, finds such a point in a finite number of changes to the perturbation parameters. When implicit equalities are present, but the original problem and its dual are feasible, Algorithm 2 asymptotically detects all the primal-dual implicit equalities and generates a point in the relative interior of the primal-dual feasible set. Algorithm 2 can also asymptotically detect primal-dual infeasibility.
Successful numerical experience with Algorithm 2 on linear programs from NETLIB and CUTEr, both with and without significant preprocessing of the problems, indicates that Algorithm 2 may be used as an algorithmic preprocessor for removing implicit equalities, with theoretical guarantees of convergence.
Infeasible Start Semidefinite Programming Algorithms Via Self-Dual Embeddings
, 1997
Cited by 5 (0 self)
The development of algorithms for semidefinite programming is an active research area, based on extensions of interior point methods for linear programming. As semidefinite programming duality theory is weaker than that of linear programming, only partial information can be obtained in some cases of infeasibility, nonzero optimal duality gaps, etc. Infeasible start algorithms have been proposed which yield different kinds of information about the solution. In this paper a comprehensive treatment of a specific initialization strategy is presented, namely self-dual embedding, where the original primal and dual problems are embedded in a larger problem with a known interior feasible starting point. A framework for infeasible start algorithms with the best obtainable complexity bound is thus presented. The information that can be obtained in cases of infeasibility, unboundedness, etc., is stated clearly. Important unresolved issues are discussed.
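As a sketch of the embedding idea in the simpler linear programming case (the paper treats the semidefinite analogue, where the vectors x and s become symmetric positive semidefinite matrices), the homogeneous self-dual model combines the primal min c^T x s.t. Ax = b, x >= 0 and its dual into one self-dual system in (x, y, s, tau, kappa):

```latex
\begin{aligned}
A x - b \tau &= 0, \\
-A^{\top} y + c \tau - s &= 0, & s &\ge 0, \\
b^{\top} y - c^{\top} x - \kappa &= 0, & \kappa &\ge 0, \\
x \ge 0, \quad \tau &\ge 0.
\end{aligned}
```

A strictly interior point for this system is available without any knowledge of the original problem, which is exactly the infeasible-start property the abstract describes. At a solution with tau > 0, the point (x/tau, y/tau, s/tau) solves the original primal-dual pair; a solution with kappa > 0 instead certifies primal or dual infeasibility.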
Convex Experimental Design Using Manifold Structure for Image Retrieval
Cited by 5 (3 self)
Content-Based Image Retrieval (CBIR) has become one of the most active research areas in computer science. Relevance feedback is often used in CBIR systems to bridge the semantic gap. Typically, users are asked to make relevance judgements on some query results, and the feedback information is then used to re-rank the images in the database. An effective relevance feedback algorithm must provide the users with the most informative images with respect to the ranking function. In this paper, we propose a novel active learning algorithm, called Convex Laplacian Regularized I-optimal Design (CLapRID), for relevance feedback image retrieval. Our algorithm is based on a regression model which minimizes the least squares error on the labeled images and simultaneously preserves the intrinsic geometrical structure of the image space. It selects the most informative images, which minimize the average predictive variance. The optimization problem of CLapRID can be cast as a semidefinite programming (SDP) problem and solved via interior-point methods. Experimental results on the COREL database demonstrate the effectiveness of the proposed algorithm for relevance feedback image retrieval.
Categories and Subject Descriptors: H.3.3 [Information Storage and Retrieval]: Information Search and Retrieval—Relevance feedback; G.3 [Mathematics of Computing]: Probability and Statistics—Experimental design
Max-min fairness in multicommodity flows
 Computers & Operations Research
Cited by 3 (1 self)
In this paper, we provide a study of Max-Min Fair (MMF) multicommodity flows and focus on some of their applications to multicommodity networks. We first present the theoretical background for the MMF problem and recall its relations with lexicographic optimization, as well as a polynomial approach for achieving leximin maximization. We next describe two applications to telecommunication networks, one on routing and the second on load-balancing. We provide some deeper theoretical analysis of MMF multicommodity flows and show how to solve the lexicographically minimum load network problem for the link load functions most frequently used in telecommunication networks. Computational results illustrate the behavior of the obtained solutions and the required CPU time for a range of random and well-dimensioned networks.
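The max-min fairness criterion itself is easy to illustrate in a setting far simpler than the multicommodity case the paper studies: sharing one link of fixed capacity among demands. The sketch below (ours, for illustration only) uses progressive filling, which realizes the leximin-optimal allocation in this single-resource case.

```python
def max_min_fair(capacity, demands):
    """Max-min fair allocation of one link's capacity by progressive
    filling: repeatedly give every unsatisfied demand an equal share of
    the remaining capacity; demands that saturate freeze at their demand."""
    alloc = [0.0] * len(demands)
    active = set(range(len(demands)))
    remaining = float(capacity)
    while active and remaining > 1e-12:
        share = remaining / len(active)
        frozen = set()
        for i in list(active):
            give = min(share, demands[i] - alloc[i])   # cap at the demand
            alloc[i] += give
            remaining -= give
            if demands[i] - alloc[i] <= 1e-12:
                frozen.add(i)
        active -= frozen
        if not frozen:
            break    # everyone took a full equal share; capacity exhausted
    return alloc

# Small demands are fully served; the rest split what remains equally:
print(max_min_fair(10, [2, 8, 8]))   # approx [2.0, 4.0, 4.0]
```

In a network, this filling must be repeated lexicographically across links with shared bottlenecks, which is where the linear-programming machinery of the paper comes in.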