Results 1–10 of 70
STABILIZED SEQUENTIAL QUADRATIC PROGRAMMING FOR OPTIMIZATION AND A STABILIZED NEWTON-TYPE METHOD FOR VARIATIONAL PROBLEMS WITHOUT CONSTRAINT QUALIFICATIONS
, 2007
Cited by 24 (14 self)
"... The stabilized version of the sequential quadratic programming algorithm (sSQP) was developed to achieve fast convergence despite possible degeneracy of the constraints of optimization problems, when the Lagrange multipliers associated with a solution are not unique. ... stronger superlinear convergence result than the above, assuming SOSC only. In addition, our analysis is carried out in the more general setting of variational problems, for which we introduce a natural extension of sSQP techniques. In the process, we also obtain a new error bound for Karush–Kuhn–Tucker ..."
COMBINING STABILIZED SQP WITH THE AUGMENTED LAGRANGIAN ALGORITHM
, 2014
"... For an optimization problem with general equality and inequality constraints, we propose an algorithm which uses subproblems of the stabilized SQP (sSQP) type for approximately solving subproblems of the augmented Lagrangian method. The motivation is to take advantage of the well-known robust behavior ..."
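For reference, this is standard background rather than anything specific to the paper above: for the equality-constrained problem $\min_x f(x)$ subject to $h(x)=0$, the classical augmented Lagrangian with penalty parameter $\rho>0$ is

```latex
L_\rho(x,\lambda) \;=\; f(x) \;+\; \lambda^\top h(x) \;+\; \frac{\rho}{2}\,\lVert h(x)\rVert^2 ,
```

and one outer iteration approximately minimizes $L_\rho(\cdot,\lambda^k)$ in $x$, then updates the multipliers by $\lambda^{k+1} = \lambda^k + \rho\, h(x^{k+1})$. The approach described in the abstract uses sSQP-type subproblems for these inner approximate minimizations.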
Stabilized SQP revisited
, 2010
"... The stabilized version of the sequential quadratic programming algorithm (sSQP) was developed to achieve superlinear convergence in situations when the Lagrange multipliers associated with a solution are not unique. Within the framework of [11], the key to local superlinear convergence ..."
Stabilized SQP revisited
 MATH. PROGRAM., SER. A
, 2010
Cited by 2 (1 self)
"... The stabilized version of the sequential quadratic programming algorithm (sSQP) was developed to achieve superlinear convergence in situations when the Lagrange multipliers associated with a solution are not unique. Within the framework of Fischer (Math Program 94:91–124, 2002), the key ..."
GLOBALIZING STABILIZED SQP BY SMOOTH PRIMAL-DUAL EXACT PENALTY FUNCTION
, 2014
"... An iteration of the stabilized sequential quadratic programming method (sSQP) consists in solving a certain quadratic program in the primal-dual space, regularized in the dual variables. The advantage with respect to classical sequential quadratic programming (SQP) is that no constraint qualification ..."
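To make the dual regularization concrete, here is a minimal sketch of an sSQP-style iteration for the equality-constrained case only, not the method of any specific paper above. It solves the regularized primal-dual linear system (the optimality system of the regularized QP) with the stabilization parameter set to the KKT residual norm; the toy problem deliberately duplicates a constraint so the Jacobian is rank deficient and the multiplier is non-unique.

```python
import numpy as np

def ssqp_equality(grad_f, hess_L, h, jac_h, x, lam, iters=30, tol=1e-10):
    """Sketch of stabilized SQP for min f(x) s.t. h(x) = 0: each step
    solves a dual-regularized primal-dual system instead of the plain
    SQP system, which may be singular under degenerate constraints."""
    for _ in range(iters):
        g, H = grad_f(x), hess_L(x, lam)
        c, A = h(x), jac_h(x)
        # Stabilization parameter: norm of the current KKT residual.
        sigma = np.linalg.norm(np.concatenate([g + A.T @ lam, c]))
        if sigma <= tol:
            break
        m, n = A.shape
        # [ H   A^T ] [  d   ]   [ -g            ]
        # [ A  -s*I ] [ lam+ ] = [ -c - s*lam    ]      (s = sigma)
        K = np.block([[H, A.T], [A, -sigma * np.eye(m)]])
        rhs = np.concatenate([-g, -c - sigma * lam])
        sol = np.linalg.solve(K, rhs)
        x, lam = x + sol[:n], sol[n:]
    return x, lam

# Degenerate toy problem: min x1^2 + x2^2 s.t. x1 + x2 = 1 imposed twice,
# so the constraint Jacobian has rank 1 and the multiplier is non-unique.
grad_f = lambda x: 2.0 * x
hess_L = lambda x, lam: 2.0 * np.eye(2)          # constraints are linear
h = lambda x: np.array([x[0] + x[1] - 1.0, 2.0 * (x[0] + x[1] - 1.0)])
jac_h = lambda x: np.array([[1.0, 1.0], [2.0, 2.0]])

x, lam = ssqp_equality(grad_f, hess_L, h, jac_h,
                       np.array([1.0, 0.0]), np.zeros(2))
print(x)    # converges to (0.5, 0.5) despite the rank-deficient Jacobian
```

Note that plain Newton/SQP cannot form its step here because the unregularized KKT matrix is singular; the $-\sigma I$ block keeps the system nonsingular whenever $\sigma > 0$ and the Hessian is positive definite on the relevant subspace.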
Modifying SQP for degenerate problems
 Preprint ANL/MCS-P699-1097, Mathematics and Computer Science Division, Argonne National Laboratory, Argonne, Ill.
, 1997
Cited by 39 (6 self)
"... Most local convergence analyses of the sequential quadratic programming (SQP) algorithm for nonlinear programming make strong assumptions about the solution, namely, that the active constraint gradients are linearly independent and that there are no weakly active constraints. ... of SQP often continue to exhibit good local convergence behavior even when the assumptions commonly made in the analysis are violated. Finally, we describe a new algorithm that formalizes and extends standard SQP implementation techniques, and we prove convergence results for this method also. ..."
Constraint identification and algorithm stabilization for degenerate nonlinear programs
 Mathematical Programming
, 2003
Cited by 20 (1 self)
"... In the vicinity of a solution of a nonlinear programming problem at which both strict complementarity and linear independence of the active constraints may fail to hold, we describe a technique for distinguishing weakly active from strongly active constraints. We show that this information ..."
Farsighted active learning on a budget for image and video recognition
 In CVPR
, 2010
Cited by 28 (2 self)
"... Active learning methods aim to select the most informative unlabeled instances to label first, and can help to focus image or video annotations on the examples that will most improve a recognition system. However, most existing methods only make myopic queries for a single label at a time, retraining at each iteration. We consider the problem where at each iteration the active learner must select a set of examples meeting a given budget of supervision, where the budget is determined by the funds (or time) available to spend on annotation. We formulate the budgeted selection task as a continuous optimization problem where we determine which subset of possible queries should maximize the improvement to the classifier's objective, without overspending the budget. To ensure farsighted batch requests, we show how to incorporate the predicted change in the model that the candidate examples will induce. We demonstrate the proposed algorithm on three datasets for object recognition, activity recognition, and content-based retrieval, and we show its clear practical advantages over random, myopic, and batch selection baselines. ..."
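The paper formulates budgeted batch selection as a continuous optimization problem; as a point of contrast, a much simpler greedy baseline of the kind such methods are compared against can be sketched as follows. The informativeness scores and annotation costs below are hypothetical, purely for illustration.

```python
def greedy_budgeted_selection(candidates, budget):
    """Greedy budgeted selection: candidates is a list of
    (example_id, informativeness, annotation_cost) triples. Examples are
    taken in order of value-per-cost; items that would overspend the
    budget are skipped, and later cheaper items may still fit."""
    chosen, spent = [], 0.0
    for ex, value, cost in sorted(candidates,
                                  key=lambda c: c[1] / c[2], reverse=True):
        if spent + cost <= budget:
            chosen.append(ex)
            spent += cost
    return chosen, spent

# Hypothetical pool: (id, informativeness score, annotation cost).
pool = [("img1", 0.9, 3.0), ("img2", 0.5, 1.0), ("img3", 0.8, 2.0),
        ("img4", 0.2, 0.5), ("img5", 0.6, 2.5)]
chosen, spent = greedy_budgeted_selection(pool, budget=4.0)
print(chosen, spent)  # ['img2', 'img3', 'img4'] 3.5
```

A greedy rule like this is myopic in exactly the sense the abstract criticizes: it scores each candidate independently and ignores how labeling one example changes the value of the others, which is what the paper's farsighted batch formulation is designed to capture.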
INEXACT JOSEPHY–NEWTON FRAMEWORK FOR GENERALIZED EQUATIONS AND ITS APPLICATIONS TO LOCAL ANALYSIS OF NEWTONIAN METHODS FOR CONSTRAINED OPTIMIZATION
, 2008
Cited by 12 (7 self)
"... We propose and analyze a perturbed version of the classical Josephy–Newton method for solving generalized equations. This perturbed framework is convenient for treating in a unified way standard sequential quadratic programming, its stabilized version, sequential quadratically constrained quadratic programming, and linearly constrained Lagrangian methods. For the linearly constrained Lagrangian methods, in particular, we obtain superlinear convergence under the second-order sufficient optimality condition and the strict Mangasarian–Fromovitz constraint qualification, while previous results in the literature assume (in addition to second-order sufficiency) the stronger linear independence constraint qualification as well as the strict complementarity condition. For the sequential quadratically constrained quadratic programming methods, we prove primal-dual superlinear/quadratic convergence under the same assumptions, which also gives a new result."
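For readers unfamiliar with the setting, the classical Josephy–Newton method (standard background, not this paper's specific perturbed variant) addresses the generalized equation

```latex
0 \in \Phi(z) + N(z),
```

where $\Phi$ is smooth and $N$ is a set-valued map (for KKT systems, $z=(x,\lambda)$ and $N$ is a normal-cone map). Given an iterate $z^k$, the next iterate solves the partially linearized inclusion

```latex
0 \in \Phi(z^k) + \Phi'(z^k)\,(z^{k+1} - z^k) + N(z^{k+1}),
```

and an inexact or perturbed variant of the kind the abstract describes allows a perturbation term on the left-hand side, so that the SQP-type methods listed above can all be read as instances of one iteration with different perturbations.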