Results 1 – 10 of 168,343
On the DSM Newton-type method
"... A wide class of the operator equations F(u) = h in a Hilbert space is studied. Convergence of a Dynamical Systems Method (DSM), based on the continuous analog of the Newton method, is proved without any smoothness assumptions on F′(u). It is assumed that F′(u) depends on u continuously. Ex ..."
Abstract

Cited by 3 (3 self)
A wide class of the operator equations F(u) = h in a Hilbert space is studied. Convergence of a Dynamical Systems Method (DSM), based on the continuous analog of the Newton method, is proved without any smoothness assumptions on F′(u). It is assumed that F′(u) depends on u continuously
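The flow in this abstract can be illustrated on a scalar model problem. A minimal sketch, assuming an explicit-Euler discretization of u̇ = −[F′(u)]⁻¹(F(u) − h); the test function F(u) = u³ + u and all names are hypothetical illustrations, not taken from the paper:

```python
# Explicit-Euler integration of the DSM Newton flow
#   u'(t) = -[F'(u)]^{-1} (F(u) - h),
# whose trajectory tends to a solution of F(u) = h.
# F(u) = u^3 + u is a hypothetical test problem: F'(u) = 3u^2 + 1 > 0,
# so the flow is well defined everywhere.

def dsm_newton_flow(F, dF, h, u0, dt=0.1, steps=300):
    u = u0
    for _ in range(steps):
        u -= dt * (F(u) - h) / dF(u)   # Euler step along the Newton flow
    return u

u = dsm_newton_flow(lambda u: u**3 + u, lambda u: 3*u**2 + 1, h=10.0, u0=0.0)
# u tends to 2, the unique real solution of u^3 + u = 10
```

With step size dt → 0 this recovers the continuous flow; with dt = 1 it is the classical Newton iteration.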
Newton-Type Methods For Stochastic Programming
 Mathematical and Computer Modelling
"... Stochastic programming is concerned with practical procedures for decision-making under uncertainty, by modelling uncertainties and risks associated with decisions in a form suitable for optimization. The field is developing rapidly with contributions from many disciplines such as operations researc ..."
Abstract

Cited by 2 (1 self)
integral. Moreover, the objective function is possibly nondifferentiable. This paper provides a brief overview of recent developments on smooth approximation techniques and Newton-type methods for solving two-stage stochastic linear programs with recourse, and parallel implementation of these methods. A
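The smooth-approximation idea this abstract alludes to can be sketched in miniature: replace the nonsmooth recourse term max(t, 0) by the classical smoothing p_ε(t) = (t + √(t² + 4ε²))/2, which satisfies 0 ≤ p_ε(t) − max(t, 0) ≤ ε, then run plain Newton on the smoothed objective. The two-scenario objective below and all names are hypothetical illustrations, not taken from the paper:

```python
import math

def p_eps(t, eps):
    """Smooth approximation of max(t, 0); uniform error at most eps."""
    return 0.5 * (t + math.sqrt(t*t + 4*eps*eps))

def dp_eps(t, eps):
    return 0.5 * (1.0 + t / math.sqrt(t*t + 4*eps*eps))

def d2p_eps(t, eps):
    s = math.sqrt(t*t + 4*eps*eps)
    return 2.0 * eps*eps / s**3

# Hypothetical two-scenario smoothed recourse objective:
#   f(x) = 0.5*(x - 1)^2 + mean_xi p_eps(xi - x),  xi in {0.5, 1.5}
scenarios = [0.5, 1.5]
eps = 1e-2

def fprime(x):
    return (x - 1.0) - sum(dp_eps(xi - x, eps) for xi in scenarios) / len(scenarios)

def fsecond(x):
    return 1.0 + sum(d2p_eps(xi - x, eps) for xi in scenarios) / len(scenarios)

x = 0.0
for _ in range(30):            # Newton iterations on the smoothed problem
    x -= fprime(x) / fsecond(x)
# x is now a stationary point of the smoothed objective
```

Driving ε → 0 along the iterations recovers the original nonsmooth problem in the limit; the sketch keeps ε fixed for simplicity.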
PROXIMAL NEWTON-TYPE METHODS FOR MINIMIZING COMPOSITE FUNCTIONS
"... Abstract. We generalize Newton-type methods for minimizing smooth functions to handle a sum of two convex functions: a smooth function and a nonsmooth function with a simple proximal mapping. We show that the resulting proximal Newton-type methods inherit the desirable convergence behavior of Newton ..."
Abstract

Cited by 8 (1 self)
Abstract. We generalize Newton-type methods for minimizing smooth functions to handle a sum of two convex functions: a smooth function and a nonsmooth function with a simple proximal mapping. We show that the resulting proximal Newton-type methods inherit the desirable convergence behavior
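The scaled proximal mapping at the heart of such methods is easiest to see in the separable case. A minimal sketch, assuming a hypothetical toy problem g(x) = ½‖Ax − b‖² with diagonal A (not from the paper): the Hessian H = AᵀA is then diagonal, the H-scaled proximal mapping of h(x) = λ‖x‖₁ reduces to componentwise soft-thresholding with threshold λ/Hᵢᵢ, and a single proximal Newton step is exact.

```python
import numpy as np

def soft(z, t):
    """Soft-thresholding: prox of t*|.| applied componentwise."""
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

# Hypothetical toy data: g(x) = 0.5*||Ax - b||^2 with A diagonal,
# h(x) = lam*||x||_1, so the Hessian H = A^T A is diagonal.
a = np.array([1.0, 2.0])          # diagonal of A
b = np.array([0.5, -3.0])
lam = 0.6
H = a**2                          # diagonal Hessian of g

x = np.zeros(2)
grad = H * x - a * b              # grad g(x) = A^T (Ax - b)
z = x - grad / H                  # Newton step in the H-metric
x_new = soft(z, lam / H)          # H-scaled prox of lam*||.||_1
# x_new = [0.0, -1.35], the exact minimizer of g + h in this separable case
```

For a general (non-diagonal) Hessian the scaled prox is itself an optimization subproblem, which is where the inexact variants below come in.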
Convergence analysis of inexact proximal Newton-type methods
"... We study inexact proximal Newton-type methods to solve convex optimization problems in composite form: minimize_{x ∈ R^n} f(x) := g(x) + h(x), where g is convex and continuously differentiable and h : R^n → R is a convex but not necessarily differentiable function whose proximal mapping can be evaluated ef ..."
Abstract
We study inexact proximal Newton-type methods to solve convex optimization problems in composite form: minimize_{x ∈ R^n} f(x) := g(x) + h(x), where g is convex and continuously differentiable and h : R^n → R is a convex but not necessarily differentiable function whose proximal mapping can be evaluated
Projected Newton-type Methods in Machine Learning
"... We consider projected Newton-type methods for solving large-scale optimization problems arising in machine learning and related fields. We first introduce an algorithmic framework for projected Newton-type methods by reviewing a canonical projected (quasi-)Newton method. This method, while conceptua ..."
Abstract

Cited by 10 (1 self)
We consider projected Newton-type methods for solving large-scale optimization problems arising in machine learning and related fields. We first introduce an algorithmic framework for projected Newton-type methods by reviewing a canonical projected (quasi-)Newton method. This method, while
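The simplest member of the projected Newton-type family is projected gradient, i.e. the identity metric H = I in the iteration x ← P(x − α H⁻¹∇f(x)). The sketch below shows that special case on a hypothetical box-constrained quadratic (the full framework restricts H carefully, since naively projecting a dense-Hessian step need not decrease f); all data are illustrative, not from the paper.

```python
import numpy as np

# Projected gradient (H = I) for min f(x) = 0.5*||x - c||^2  s.t.  0 <= x <= 1.
# c and the box are hypothetical; the point is the pattern
#   x <- P(x - alpha * grad f(x)),  with P the Euclidean projection.
c = np.array([2.0, -0.5, 0.3])
x = np.zeros(3)
alpha = 0.5
for _ in range(60):
    x = np.clip(x - alpha * (x - c), 0.0, 1.0)   # gradient step, then project
# x converges to the projection of c onto the box: [1.0, 0.0, 0.3]
```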
Proximal Newton-type methods for convex optimization
"... We seek to solve convex optimization problems in composite form: minimize_{x ∈ R^n} f(x) := g(x) + h(x), where g is convex and continuously differentiable and h : R^n → R is a convex but not necessarily differentiable function whose proximal mapping can be evaluated efficiently. We derive a generalizatio ..."
Abstract

Cited by 15 (0 self)
generalization of Newton-type methods to handle such convex but nonsmooth objective functions. We prove such methods are globally convergent and achieve superlinear rates of convergence in the vicinity of an optimal solution. We also demonstrate the performance of these methods using problems of relevance
A continuous Newton-type method for unconstrained optimization
"... In this paper, we propose a continuous Newton-type method in the form of an ordinary differential equation by combining the negative gradient and Newton’s direction. It is shown that for a general function f(x), our method converges globally to a connected subset of the stationary points of f(x) und ..."
Abstract

Cited by 4 (2 self)
In this paper, we propose a continuous Newton-type method in the form of an ordinary differential equation by combining the negative gradient and Newton’s direction. It is shown that for a general function f(x), our method converges globally to a connected subset of the stationary points of f
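One simple way to blend the two directions (a sketch under assumed data, not the paper's construction) is a Levenberg–Marquardt-style flow ẋ = −(∇²f(x) + μI)⁻¹∇f(x), which tends to the Newton direction as μ → 0 and to the negative gradient as μ → ∞. The quadratic below is a hypothetical example:

```python
import numpy as np

# Explicit Euler on the regularized Newton flow
#   x'(t) = -(H(x) + mu*I)^{-1} grad f(x),
# interpolating between Newton's direction (mu -> 0)
# and the negative gradient (mu -> infinity).
Q = np.array([[3.0, 1.0], [1.0, 2.0]])   # hypothetical f(x) = 0.5 * x^T Q x
mu, dt = 1.0, 0.2
x = np.array([5.0, -3.0])
I = np.eye(2)
for _ in range(300):
    grad = Q @ x                          # grad f(x)
    x -= dt * np.linalg.solve(Q + mu * I, grad)
# x approaches the stationary point at the origin
```

For a nonconvex f, μ would additionally have to dominate the most negative Hessian eigenvalue for the direction to remain a descent direction.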
Newton-Type Methods With Generalized Distances For Constrained Optimization
, 1997
"... We consider a class of interior point algorithms for minimizing a twice continuously differentiable function over a closed convex set with nonempty interior. On one hand, our algorithms can be viewed as an approximate version of the generalized proximal point methods and, on the other hand, as an ex ..."
Abstract

Cited by 6 (3 self)
, as an extension of unconstrained Newton-type methods to the constrained case. Each step consists of solving a strongly convex unconstrained program followed by a one-dimensional search along either a line or a curve segment in the interior of the feasible set. The information about the feasible set is contained
Fast Newton-type Methods for Total Variation Regularization
"... Numerous applications in statistics, signal processing, and machine learning regularize using Total Variation (TV) penalties. We study anisotropic (ℓ1-based) TV and also a related ℓ2-norm variant. We consider for both variants associated (1D) proximity operators, which lead to challenging optimiz ..."
Abstract
optimization problems. We solve these problems by developing Newton-type methods that outperform the state-of-the-art algorithms. More importantly, our 1D-TV algorithms serve as building blocks for solving the harder task of computing 2- (and higher-)dimensional TV proximity. We illustrate the computational
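For context, the 1D-TV proximity problem referred to above, min_x ½‖x − y‖² + λ‖Dx‖₁ with (Dx)ᵢ = xᵢ₊₁ − xᵢ, can be solved by a simple dual method. The sketch below uses projected gradient on the box-constrained dual, a slow baseline rather than the paper's Newton-type approach; the function name and data are hypothetical.

```python
import numpy as np

def tv_prox(y, lam, iters=2000, step=0.25):
    """Prox of lam*TV at y via projected gradient on the dual:
       min_u 0.5*||y - D^T u||^2  s.t.  ||u||_inf <= lam,
       then x = y - D^T u.  step <= 1/||D D^T|| <= 1/4 is safe."""
    n = len(y)
    D = np.eye(n - 1, n, 1) - np.eye(n - 1, n)      # forward differences
    u = np.zeros(n - 1)
    for _ in range(iters):
        u = np.clip(u + step * (D @ (y - D.T @ u)), -lam, lam)
    return y - D.T @ u

x = tv_prox(np.array([0.0, 0.1, 1.9, 2.0]), lam=0.5)
# x is a piecewise-flattened version of y
```

The dense D here is purely illustrative; the sublinear rate of this dual ascent on large signals is exactly the motivation for the faster Newton-type solvers the abstract describes.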