Results 1–10 of 142
Convex Nondifferentiable Optimization: A Survey Focussed On The Analytic Center Cutting Plane Method.
1999
Abstract

Cited by 75 (3 self)
We present a survey of nondifferentiable optimization problems and methods, with special focus on the analytic center cutting plane method. We propose a self-contained convergence analysis that uses the formalism of the theory of self-concordant functions, but for the main results we give direct proofs based on the properties of the logarithmic function. We also provide an in-depth analysis of two extensions that are very relevant to practical problems: the case of multiple cuts and the case of deep cuts. We further examine extensions to problems including feasible sets partially described by an explicit barrier function, and to the case of nonlinear cuts. Finally, we review several implementation issues and discuss some applications.
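The analytic center at the heart of this survey is the minimizer of a logarithmic barrier over the current polyhedral localization set. A minimal sketch of computing it by damped Newton steps, assuming a strictly feasible starting point (a toy illustration with a made-up box example, not the authors' implementation):

```python
import numpy as np

def analytic_center(A, b, x0, iters=50):
    """Damped Newton minimization of the log-barrier F(x) = -sum_i log(b_i - A_i x).

    Returns an approximate analytic center of {x : A x <= b}.  Toy sketch of
    the subproblem solved at each step of an analytic center cutting plane
    method; x0 must be strictly feasible.
    """
    x = np.asarray(x0, dtype=float)
    for _ in range(iters):
        s = b - A @ x                   # slacks, strictly positive
        g = A.T @ (1.0 / s)             # barrier gradient
        H = (A.T * (1.0 / s**2)) @ A    # barrier Hessian A^T diag(1/s^2) A
        step = np.linalg.solve(H, g)
        t = 1.0                         # backtrack to stay strictly feasible
        while np.min(b - A @ (x - t * step)) <= 0:
            t *= 0.5
        x = x - t * step
    return x

# By symmetry, the analytic center of the box -1 <= x <= 1 is the origin.
A = np.vstack([np.eye(2), -np.eye(2)])
b = np.ones(4)
center = analytic_center(A, b, np.array([0.5, -0.3]))
```

In a cutting plane scheme, each oracle answer appends a row to `A` and an entry to `b`, and the center is recomputed.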
Filter Pattern Search Algorithms for Mixed Variable Constrained Optimization Problems
SIAM Journal on Optimization, 2004
Abstract

Cited by 55 (6 self)
A new class of algorithms for solving nonlinearly constrained mixed variable optimization problems is presented. This class combines and extends the Audet-Dennis Generalized Pattern Search (GPS) algorithms for bound constrained mixed variable optimization, and their GPS-filter algorithms for general nonlinear constraints. In generalizing existing algorithms, new theoretical convergence results are presented that reduce seamlessly to existing results for more specific classes of problems. While no local continuity or smoothness assumptions are required to apply the algorithm, a hierarchy of theoretical convergence results based on the Clarke calculus is given, in which local smoothness dictates what can be proved about certain limit points generated by the algorithm. To demonstrate its usefulness, the algorithm is applied to the design of a load-bearing thermal insulation system. We believe this is the first algorithm with provable convergence results to directly target this class of problems.
Complexity Analysis of an Interior Cutting Plane Method for Convex Feasibility Problems
Abstract

Cited by 53 (12 self)
We further analyze the convergence and the complexity of a dual column generation algorithm for solving general convex feasibility problems defined by a separation oracle. The oracle is called at an approximate analytic center of the set given by the intersection of the linear inequalities which are the previous answers of the oracle. We show that the algorithm converges in finite time and is in fact a fully polynomial approximation algorithm, provided that the feasible region has a nonempty interior.
Variable Metric Bundle Methods: from Conceptual to Implementable Forms
1996
Abstract

Cited by 52 (10 self)
To minimize a convex function, we combine Moreau-Yosida regularizations, quasi-Newton matrices and bundling mechanisms. First we develop conceptual forms using "reversal" quasi-Newton formulae and we state their global and local convergence. Then, to produce implementable versions, we incorporate a bundle strategy together with a "curve-search". No convergence results are given for the implementable versions; however, some numerical illustrations show their good behaviour even for large-scale problems.
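The Moreau-Yosida regularization of a convex f is F_lam(x) = min_y f(y) + ||y - x||^2 / (2*lam): a smooth function with the same minimizers as f. For f = |.| the proximal point and the envelope have closed forms; a 1-D toy illustrating the construction (not the paper's method, which combines it with quasi-Newton updates and bundling):

```python
import numpy as np

def prox_abs(x, lam):
    """Proximal point of f = |.|: argmin_y |y| + (y - x)**2 / (2 * lam).
    The closed form is soft-thresholding."""
    return np.sign(x) * np.maximum(np.abs(x) - lam, 0.0)

def moreau_envelope_abs(x, lam):
    """Moreau-Yosida regularization of |.|: a smooth (Huber-like) function
    with the same minimizer as |.| itself."""
    p = prox_abs(x, lam)
    return np.abs(p) + (p - x) ** 2 / (2.0 * lam)

# The envelope is quadratic near the kink and linear far away:
near = moreau_envelope_abs(0.5, 1.0)   # x**2 / (2*lam) for |x| <= lam
far = moreau_envelope_abs(3.0, 1.0)    # |x| - lam/2 for |x| > lam
```

The envelope replaces the kink at 0 by a smooth quadratic cap, which is what makes quasi-Newton machinery applicable.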
A Cutting Plane Method from Analytic Centers for Stochastic Programming
Mathematical Programming, 1994
Abstract

Cited by 47 (17 self)
The stochastic linear programming problem with recourse has a dual block-angular structure. It can thus be handled by Benders decomposition or by Kelley's method of cutting planes; equivalently, the dual problem has a primal block-angular structure and can be handled by Dantzig-Wolfe decomposition; the two approaches are in fact identical by duality. Here we shall investigate the use of the method of cutting planes from analytic centers applied to similar formulations. The only significant difference from the aforementioned methods is that new cutting planes (or columns, by duality) will be generated not from the optimum of the linear programming relaxation, but from the analytic center of the set of localization.
1 Introduction. The study of optimization problems in the presence of uncertainty still taxes the limits of methodology and software. One of the most approachable settings is that of two-stage planning under uncertainty, in which a first-stage decision has to be taken bef...
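Kelley's method of cutting planes mentioned above can be illustrated on a 1-D convex function: tangent lines accumulate into a piecewise-linear lower model whose minimizer is the next query point. The sketch below minimizes the model by enumerating cut intersections rather than solving an LP (a stdlib-only toy; the analytic-center variant discussed in the abstract would query the center of the localization set instead of the model minimizer):

```python
def kelley_1d(f, grad, lb, ub, x0, iters=40):
    """Kelley's cutting plane method for a convex f on [lb, ub] (1-D toy).

    Each cut t >= g*z + c is a tangent line of f; the piecewise-linear lower
    model is minimized by checking the endpoints and all cut intersections
    (a real implementation would solve an LP instead).
    """
    cuts = []                           # (slope g, intercept c) per cut
    x = x0
    for _ in range(iters):
        g = grad(x)
        cuts.append((g, f(x) - g * x))  # tangent of f at the current x

        def model(z):
            return max(gi * z + ci for gi, ci in cuts)

        candidates = [lb, ub]
        for i in range(len(cuts)):
            for j in range(i + 1, len(cuts)):
                gi, ci = cuts[i]
                gj, cj = cuts[j]
                if gi != gj:
                    z = (cj - ci) / (gi - gj)   # intersection of cuts i, j
                    if lb <= z <= ub:
                        candidates.append(z)
        x = min(candidates, key=model)  # minimizer of the current model
    return x

# Minimize x**2 on [-2, 2]; the model minimizers converge to 0.
xstar = kelley_1d(lambda x: x * x, lambda x: 2.0 * x, -2.0, 2.0, 1.5)
```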
Solving Nonlinear Multicommodity Flow Problems By The Analytic Center Cutting Plane Method
1995
Abstract

Cited by 43 (16 self)
The paper deals with nonlinear multicommodity flow problems with convex costs. A decomposition method is proposed to solve them. The approach applies a potential reduction algorithm to solve the master problem approximately and a column generation technique to define a sequence of primal linear programming problems. Each subproblem consists of finding a minimum-cost flow between an origin and a destination node in an uncapacitated network. It is thus formulated as a shortest path problem and solved with Dijkstra's d-heap algorithm. An implementation is described that takes full advantage of the supersparsity of the network in the linear algebra operations. Computational results show the efficiency of this approach on well-known nondifferentiable problems and also on large-scale randomly generated problems (up to 1000 arcs and 5000 commodities). This research has been supported by the Fonds National de la Recherche Scientifique Suisse, grant #12-34002.92, NSERC-Canada and ...
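Each subproblem above is a single-source shortest-path computation with nonnegative arc costs. A standard Dijkstra sketch using Python's binary heap (`heapq`) rather than the d-heap the abstract mentions; the example graph is made up for illustration:

```python
import heapq

def dijkstra(adj, src):
    """Shortest-path distances from src in a graph with nonnegative arc costs.

    `adj` maps node -> list of (neighbor, cost).  Uses a lazy-deletion binary
    heap; the d-heap of the paper changes the constants, not the idea.
    """
    dist = {src: 0.0}
    pq = [(0.0, src)]
    while pq:
        d, u = heapq.heappop(pq)
        if d > dist.get(u, float("inf")):
            continue                    # stale heap entry, skip
        for v, w in adj.get(u, []):
            nd = d + w
            if nd < dist.get(v, float("inf")):
                dist[v] = nd
                heapq.heappush(pq, (nd, v))
    return dist

# Hypothetical network: shortest o -> d route is o -> a -> b -> d, cost 3.
adj = {"o": [("a", 1.0), ("b", 4.0)],
       "a": [("b", 1.0), ("d", 6.0)],
       "b": [("d", 1.0)]}
d = dijkstra(adj, "o")
```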
Tighter and convex maximum margin clustering
In AISTATS, 2009
Abstract

Cited by 41 (14 self)
The maximum margin principle has been successfully applied to many supervised and semi-supervised problems in machine learning. Recently, this principle was extended to clustering, referred to as Maximum Margin Clustering (MMC), and achieved promising performance in recent studies. To avoid the problem of local minima, MMC can be solved globally via a convex semidefinite programming (SDP) relaxation. Although many efficient approaches have been proposed to alleviate the computational burden of SDP, convex MMCs are still not scalable for medium data sets. In this paper, we propose a novel convex optimization method, LG-MMC, which maximizes the margin of opposite clusters via "Label Generation". It can be shown that LG-MMC is much more scalable than existing convex approaches. Moreover, we show that our convex relaxation is tighter than state-of-the-art convex MMCs. Experiments on seventeen UCI datasets and the MNIST dataset show significant improvement over existing MMC algorithms.
Warm Start of the Primal-Dual Method Applied in the Cutting-Plane Scheme
Mathematical Programming, 1997
Abstract

Cited by 37 (6 self)
A practical warm-start procedure is described for the infeasible primal-dual interior-point method employed to solve the restricted master problem within the cutting-plane method. In contrast to the theoretical developments in this field, the approach presented in this paper does not make the unrealistic assumption that the new cuts are shallow. Moreover, it treats systematically the case when a large number of cuts are added at one time. The technique proposed in this paper has been implemented in the context of HOPDM, a state-of-the-art yet public-domain interior-point code. Numerical results confirm the high efficiency of this approach: regardless of the number of cuts added at one time (thousands, in the largest examples) and regardless of the depth of the new cuts, reoptimizations are usually done within a few additional iterations. Key words: warm start, primal-dual algorithm, cutting-plane methods. Supported by the Fonds National de la Recherche Scientifique Su...