Results 1 – 10 of 101,071
Non-Monotone Trust-Region Methods for Bound-Constrained Semismooth Equations with Applications to Nonlinear Mixed Complementarity Problems
1999
"... We develop and analyze a class of trust-region methods for bound-constrained semismooth systems of equations. The algorithm is based on a simply constrained differentiable minimization reformulation. Our global convergence results are developed in a very general setting that allows for nonmonotonicity of the function values at subsequent iterates. We propose a way of computing trial steps by a semismooth Newton-like method that is augmented by a projection onto the feasible set. Under a Dennis–Moré-type condition we prove that close to a BD-regular solution the trust-region algorithm turns into this projected ..."
Cited by 22 (4 self)
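The trial step described in this entry — a Newton-like step followed by projection onto the feasible set — can be sketched in finite dimensions as follows. This is a minimal illustrative sketch, not the paper's algorithm; the names `F`, `J`, `lo`, and `hi` are assumptions standing in for the residual, a (generalized) Jacobian element, and the box bounds:

```python
import numpy as np

def projected_newton_step(F, J, x, lo, hi):
    """One projected Newton-like trial step (illustrative sketch).

    F      : callable, residual mapping R^n -> R^n
    J      : callable, returns a (generalized) Jacobian element at x
    lo, hi : elementwise bounds defining the feasible box
    """
    d = np.linalg.solve(J(x), -F(x))   # Newton-like direction
    return np.clip(x + d, lo, hi)      # project the trial point onto [lo, hi]

# Smooth 1-D illustration: F(x) = x^2 - 2 on the box [0, 10]
F = lambda x: x**2 - 2.0
J = lambda x: np.diag(2.0 * x)
x1 = projected_newton_step(F, J, np.array([2.0]), 0.0, 10.0)
```

The projection keeps every trial iterate feasible, which is the point of augmenting the Newton step in the bound-constrained setting.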
On Newton-like Methods
Numerische Mathematik, 11:324–330, 1968
"... and complementary methods for large-scale structural analysis of mammalian chromatin ..."
Cited by 4 (0 self)
Tame functions are semismooth
hal-00777707, version 1, 2013
"... Abstract: Superlinear convergence of the Newton method for nonsmooth equations requires a "semismoothness" assumption. In this work we prove that locally Lipschitz functions definable in an o-minimal structure (in particular semialgebraic or globally subanalytic functions) are semismooth. Semialgebraic, or more generally, globally subanalytic mappings present the special interest of being γ-order semismooth, where γ is a positive parameter. As an application of this new estimate, we show that the error at the k-th step of the Newton method behaves like O(2^(−(1+γ)^k)) ..."
Cited by 16 (7 self)
Semismooth Newton methods for operator equations in function spaces
2000
"... We develop a semismoothness concept for nonsmooth superposition operators in function spaces. The considered class of operators includes NCP-function-based reformulations of infinite-dimensional nonlinear complementarity problems, and thus covers a very comprehensive class of applications. Our results ... these semismoothness results to develop a Newton-like method for nonsmooth operator equations and prove its local q-superlinear convergence to regular solutions. If the underlying operator is α-order semismooth, convergence of q-order 1 + α is proved. We also establish the semismoothness of composite operators ..."
Cited by 50 (3 self)
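A standard NCP function behind such reformulations (the specific choice in this entry is not shown) is the Fischer–Burmeister function, which is semismooth but not differentiable at the origin. A minimal sketch:

```python
import numpy as np

def fischer_burmeister(a, b):
    """Fischer-Burmeister NCP function:
    phi(a, b) = sqrt(a^2 + b^2) - a - b,
    with phi(a, b) = 0  <=>  a >= 0, b >= 0, a*b = 0."""
    return np.sqrt(a**2 + b**2) - a - b
```

Applied componentwise to the pair (x, F(x)), it turns a complementarity problem into a square system of semismooth equations, which is what a semismooth Newton method then solves.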
SEMISMOOTH METHODS FOR LINEAR AND NONLINEAR SECOND-ORDER CONE PROGRAMS
2006
"... The optimality conditions of a nonlinear second-order cone program can be reformulated as a nonsmooth system of equations using a projection mapping. This allows the application of nonsmooth Newton methods for the solution of the nonlinear second-order cone program. Conditions for the local quadratic convergence ..."
Keywords: linear second-order cone program, nonlinear second-order cone program, semismooth function, nonsmooth Newton method, quadratic convergence
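The projection mapping mentioned in this entry has a well-known closed form for the second-order cone K = {(z0, z̄) : ‖z̄‖ ≤ z0}. The sketch below uses that standard formula, not the paper's notation:

```python
import numpy as np

def proj_soc(z):
    """Euclidean projection onto the second-order cone
    K = {(z0, zbar) : ||zbar||_2 <= z0}, via its closed form."""
    z = np.asarray(z, dtype=float)
    z0, zbar = z[0], z[1:]
    nrm = np.linalg.norm(zbar)
    if nrm <= z0:                 # already inside K
        return z.copy()
    if nrm <= -z0:                # inside -K (K is self-dual): projects to 0
        return np.zeros_like(z)
    alpha = 0.5 * (z0 + nrm)     # otherwise: project onto the boundary of K
    return np.concatenate(([alpha], (alpha / nrm) * zbar))
```

With such a projection in hand, optimality conditions of the cone program can be restated as a nonsmooth equation (e.g. a natural-residual form x − P_K(x − F(x)) = 0), to which nonsmooth Newton methods apply.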
TAME MAPPINGS ARE SEMISMOOTH
"... Dedicated to Stephen Robinson, who has so many of the best ideas first. Abstract: Superlinear convergence of the Newton method for nonsmooth equations requires a "semismoothness" assumption. In this work we prove that locally Lipschitz functions definable in an o-minimal structure (in particular semialgebraic or globally subanalytic functions) are semismooth ... the error at the k-th step of the Newton method behaves like O(2^(−(1+γ)^k)). Key words: semismoothness, semialgebraic function, o-minimal structure, nonsmooth Newton method, structured optimization problem, superlinear convergence."
Cited by 5 (2 self)
Newton-Like Methods for Sparse Inverse Covariance Estimation
2012
"... We propose two classes of second-order optimization methods for solving the sparse inverse covariance estimation problem. The first approach, which we call the Newton-LASSO method, minimizes a piecewise quadratic model of the objective function at every iteration to generate a step. We employ the fa ..."
Cited by 23 (3 self)
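The piecewise quadratic model in a Newton-LASSO-style method is a quadratic plus an ℓ1 term. The basic building block for minimizing such models (e.g. by iterative shrinkage or coordinate descent) is the soft-thresholding operator, sketched here as a generic tool, not as code from the paper; `ista_step` and its arguments are illustrative names:

```python
import numpy as np

def soft_threshold(v, t):
    """Proximal operator of t*||.||_1: shrink each entry toward 0 by t."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def ista_step(H, g, lam, x, step):
    """One proximal-gradient (ISTA-style) step on the l1-regularized
    quadratic model m(x) = 0.5*x^T H x + g^T x + lam*||x||_1."""
    grad = H @ x + g                       # gradient of the smooth part
    return soft_threshold(x - step * grad, step * lam)

out = soft_threshold(np.array([3.0, -0.5, 1.0]), 1.0)
y = ista_step(np.eye(2), np.array([-2.0, 0.0]), 0.5, np.zeros(2), 1.0)
```

Repeating such steps on the model generates the trial step of the outer second-order method.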
The Inexact Newton-Like Method for Inverse Eigenvalue Problem
BIT, 2001
"... In this paper, we consider using the inexact Newton-like method for solving the inverse eigenvalue problem. This method can minimize the over-solving problem of Newton-like methods and hence improve the efficiency. We give the convergence analysis of the method, and provide numerical tests to illustrate ..."
Cited by 11 (1 self)
A Newton-Like Method for Convex Functions
"... A Newton-like method for convex functions is derived. It is shown that this method can be better than the Newton method. Especially good results can be obtained if we combine these two methods. Illustrative numerical examples are given. ..."
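For reference, the baseline such Newton-like methods are compared against is the classical scalar Newton iteration x_{k+1} = x_k − f(x_k)/f′(x_k). A minimal sketch (illustrative only, not the paper's modified method):

```python
def newton(f, fprime, x0, tol=1e-12, maxit=100):
    """Classical Newton iteration for solving f(x) = 0 (scalar sketch)."""
    x = x0
    for _ in range(maxit):
        step = f(x) / fprime(x)
        x -= step
        if abs(step) < tol:   # stop once the update is negligible
            break
    return x

# Example: root of the convex function f(x) = x^2 - 2, started right of the root
root = newton(lambda x: x**2 - 2.0, lambda x: 2.0 * x, 1.5)
```

For a convex f started to the right of a root, these iterates decrease monotonically toward the root, which is the regime where modified Newton-like schemes are typically compared.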
LOCAL CONVERGENCE ANALYSIS OF INEXACT NEWTON-LIKE METHODS
"... Abstract: We provide a local convergence analysis of inexact Newton-like methods in a Banach space setting under flexible majorant conditions. By introducing a center-Lipschitz-type condition, we provide (under the same computational cost) a convergence analysis with the following advantages over earlier ..."
Cited by 1 (0 self)