Results 1–6 of 6
An Inexact Hybrid Generalized Proximal Point Algorithm And Some New Results On The Theory Of Bregman Functions
Mathematics of Operations Research, 2000
Abstract

Cited by 42 (10 self)
We present a new Bregman-function-based algorithm which is a modification of the generalized proximal point method for solving the variational inequality problem with a maximal monotone operator. The principal advantage of the presented algorithm is that it allows a more constructive error tolerance criterion in solving the proximal point subproblems. Furthermore, we eliminate the assumption of pseudomonotonicity which was, until now, standard in proving convergence for paramonotone operators. Thus we obtain a convergence result which is new even for exact generalized proximal point methods. Finally, we present some new results on the theory of Bregman functions. For example, we show that the standard assumption of convergence consistency is a consequence of the other properties of Bregman functions, and is therefore superfluous.
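The Bregman-function machinery referenced in this abstract rests on the Bregman distance D_f(x, y) = f(x) − f(y) − ⟨∇f(y), x − y⟩ induced by a convex function f. As an illustration (not taken from the paper; the helper names are hypothetical), the following sketch evaluates this distance for two classical choices of f: the squared norm, which recovers half the squared Euclidean distance, and the negative entropy, which recovers the Kullback-Leibler divergence on the probability simplex.

```python
import numpy as np

def bregman_distance(f, grad_f, x, y):
    """D_f(x, y) = f(x) - f(y) - <grad f(y), x - y>; nonnegative for convex f."""
    return f(x) - f(y) - np.dot(grad_f(y), x - y)

# f(x) = 1/2 ||x||^2 induces half the squared Euclidean distance.
sq = lambda x: 0.5 * np.dot(x, x)
sq_grad = lambda x: x

# f(x) = sum_i x_i log x_i (negative entropy on the positive orthant)
# induces the Kullback-Leibler divergence when x and y lie on the simplex.
ent = lambda x: np.sum(x * np.log(x))
ent_grad = lambda x: np.log(x) + 1.0

x = np.array([0.2, 0.8])
y = np.array([0.5, 0.5])
print(bregman_distance(sq, sq_grad, x, y))   # 0.5 * ||x - y||^2 = 0.09
print(bregman_distance(ent, ent_grad, x, y)) # equals KL(x || y) here
```

Different choices of f change the geometry of the proximal subproblems, which is what the generalized (Bregman) proximal point method exploits.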
A Globally Convergent Inexact Newton Method for Systems of Monotone Equations
, 1998
Abstract

Cited by 24 (14 self)
We propose an algorithm for solving systems of monotone equations which combines Newton, proximal point, and projection methodologies. An important property of the algorithm is that the whole sequence of iterates is always globally convergent to a solution of the system without any additional regularity assumptions. Moreover, under standard assumptions the local superlinear rate of convergence is achieved. As opposed to classical globalization strategies for Newton methods, for computing the step size we do not use a line search aimed at decreasing the value of some merit function. Instead, a line search in the approximate Newton direction is used to construct an appropriate hyperplane which separates the current iterate from the solution set. This step is followed by projecting the current iterate onto this hyperplane, which ensures global convergence of the algorithm. The computational cost of each iteration of our method is of the same order as that of the classical damped Newton method.
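The hyperplane projection step described in this abstract has a simple closed form: once a trial point z with ⟨F(z), x − z⟩ > 0 is found, projecting the iterate x onto the hyperplane {u : ⟨F(z), u − z⟩ = 0} is a single vector operation. The following is a minimal sketch of that step for a monotone map F, illustrating the general technique rather than the paper's exact algorithm:

```python
import numpy as np

def project_onto_separating_hyperplane(x, z, Fz):
    """Project x onto the hyperplane {u : <Fz, u - z> = 0}.

    By monotonicity of F, every solution x* of F(x) = 0 satisfies
    <F(z), x* - z> <= 0, so whenever <F(z), x - z> > 0 this hyperplane
    strictly separates x from the solution set, and the projection
    moves x closer to every solution.
    """
    gap = np.dot(Fz, x - z)
    if gap <= 0:          # hyperplane does not separate; keep x unchanged
        return x
    return x - (gap / np.dot(Fz, Fz)) * Fz

# Toy monotone map F(x) = x (unique solution x* = 0): one projection
# from x = (1, 1) through the trial point z = (0.5, 0.5), where F(z) = z.
x = np.array([1.0, 1.0])
z = np.array([0.5, 0.5])
x_next = project_onto_separating_hyperplane(x, z, z)
print(x_next)  # [0.5 0.5], strictly closer to the solution 0
```

The Fejér-monotonicity of this projection (distance to every solution never increases) is what delivers global convergence without merit functions.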
A globally convergent BFGS method for nonlinear monotone equations
 Mathematics of Computation
Abstract

Cited by 2 (1 self)
Abstract. Since 1965, there has been significant progress in the theoretical study of quasi-Newton methods for solving nonlinear equations, especially in local convergence analysis. However, studies on the global convergence of quasi-Newton methods are relatively scarce, especially for the BFGS method. To ensure global convergence, some merit function, such as the squared-norm merit function, is typically used. In this paper, we propose an algorithm for solving nonlinear monotone equations which combines the BFGS method and the hyperplane projection method. We also prove that the proposed BFGS method converges globally if the equation is monotone and Lipschitz continuous, without any differentiability requirement on the equation, which makes it possible to solve some nonsmooth equations. An attractive property of the proposed method is that its global convergence is independent of any merit function. We also report some numerical results to show the efficiency of the proposed method.
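The matrix recursion at the core of any BFGS-based method is the standard rank-two update built from the step s and the function-value difference y. As a hedged sketch of that building block (the safeguard threshold `eps` is an illustrative choice, not the paper's):

```python
import numpy as np

def bfgs_update(B, s, y, eps=1e-10):
    """Standard BFGS update of an approximation B:

        B+ = B - (B s s^T B) / (s^T B s) + (y y^T) / (y^T s),

    skipped when the curvature s^T y is too small, which keeps B
    symmetric positive definite.
    """
    sy = np.dot(s, y)
    if sy <= eps:
        return B                      # skip update; preserve definiteness
    Bs = B @ s
    return B - np.outer(Bs, Bs) / np.dot(s, Bs) + np.outer(y, y) / sy

# Secant check: by construction the updated matrix maps s exactly to y.
B = np.eye(2)
s = np.array([1.0, 0.0])
y = np.array([2.0, 1.0])
B_new = bfgs_update(B, s, y)
print(B_new @ s)  # equals y = [2. 1.]
```

In a projection method of the kind this abstract describes, the direction solves B_k d = −F(x_k), and the hyperplane projection step (rather than a merit function) supplies globalization.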
A Modified Liu-Storey Conjugate Gradient Projection Algorithm for Nonlinear Monotone Equations
Abstract
Copyright © 2014 Yaping Hu and Zengxin Wei. This is an open access article distributed under the Creative Commons Attribution License. In this paper, a modified Liu-Storey (LS) conjugate gradient projection algorithm is proposed for solving nonlinear monotone equations based on a hyperplane projection technique. The proposed method is derivative-free and, owing to its low storage requirement, can be applied to large-scale nonsmooth equations. We establish its global convergence under suitable conditions. Numerical results show that the algorithm is efficient and promising.
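The "derivative-free" part of methods like this one is the line search: it uses only values of F, no Jacobians or gradients. A common acceptance condition in the projection-method literature is −⟨F(x + t d), d⟩ ≥ σ t ‖d‖². The sketch below illustrates that generic condition under stated default parameters; it is not the specific line search of this paper:

```python
import numpy as np

def derivative_free_linesearch(F, x, d, sigma=1e-4, beta=0.5, max_back=50):
    """Backtrack for a step t satisfying -<F(x + t d), d> >= sigma * t * ||d||^2.

    Only values of F are needed (no derivatives), which is what makes
    projection methods of this kind applicable to nonsmooth equations.
    """
    t = 1.0
    dd = np.dot(d, d)
    for _ in range(max_back):
        z = x + t * d
        if -np.dot(F(z), d) >= sigma * t * dd:
            return t, z
        t *= beta
    return t, x + t * d

# Toy monotone map F(x) = x with the residual direction d = -F(x).
F = lambda x: x
x = np.array([1.0, 2.0])
d = -F(x)
t, z = derivative_free_linesearch(F, x, d)
print(t, z)  # t = 0.5 after one backtrack; z = [0.5 1.]
```

The accepted point z then serves as the trial point for the hyperplane projection step, exactly as in the other projection methods listed above.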
SOME GLOBAL CONVERGENCE PROPERTIES OF THE LEVENBERG-MARQUARDT METHODS WITH LINE SEARCH
Abstract
Abstract. In this paper, we consider two kinds of Levenberg-Marquardt methods for solving a system of nonlinear equations. We use a line search at every iteration to guarantee that the Levenberg-Marquardt methods are globally convergent. Under mild conditions, we prove that when the descent condition is satisfied at each iteration of the Levenberg-Marquardt method, the global convergence of the method can be established.
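A generic Levenberg-Marquardt iteration of the kind this abstract discusses solves the regularized normal equations for the direction and then backtracks on the merit function ½‖F(x)‖². The following is a minimal sketch under standard assumptions (the parameter choices and the toy system are illustrative, not the paper's):

```python
import numpy as np

def lm_step(J, Fx, mu):
    """Levenberg-Marquardt direction: solve (J^T J + mu I) d = -J^T F(x)."""
    n = J.shape[1]
    return np.linalg.solve(J.T @ J + mu * np.eye(n), -J.T @ Fx)

def lm_linesearch(F, x, d, beta=0.5, sigma=1e-4, max_back=30):
    """Armijo-style backtracking on the merit function phi(x) = 0.5 ||F(x)||^2."""
    phi0 = 0.5 * np.dot(F(x), F(x))
    t = 1.0
    for _ in range(max_back):
        xt = x + t * d
        if 0.5 * np.dot(F(xt), F(xt)) <= phi0 - sigma * t * np.dot(d, d):
            return xt
        t *= beta
    return x + t * d

# One iteration on the linear system F(x) = (x0 + x1 - 3, x0 - x1 + 1),
# whose solution is (1, 2); the Jacobian is constant here.
F = lambda x: np.array([x[0] + x[1] - 3.0, x[0] - x[1] + 1.0])
J = np.array([[1.0, 1.0], [1.0, -1.0]])
x = np.zeros(2)
d = lm_step(J, F(x), mu=1e-3)
x1 = lm_linesearch(F, x, d)
print(x1)  # close to the solution (1, 2)
```

The regularization mu keeps the linear system solvable even when J^T J is singular; the line search is what upgrades this local method to a globally convergent one, which is the point of the paper.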