Results 1-9 of 9
A new active set algorithm for box constrained optimization. SIAM Journal on Optimization, 2006.
An interior point Newton-like method for nonnegative least squares problems with degenerate solution. Numer. Linear Algebra Appl. Cited by 16 (0 self).

Abstract. An interior point approach for medium and large nonnegative linear least-squares problems is proposed. Global and locally quadratic convergence is shown even if a degenerate solution is approached. Viable approaches for implementation are discussed and numerical results are provided.
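The nonnegative least-squares problem treated in this entry can be illustrated with a much simpler baseline than the paper's interior point Newton-like method: a projected-gradient iteration. The sketch below is illustrative only (function name, step rule, and test data are our own), but it shows the problem being solved, min ||Ax - b||^2 subject to x >= 0.

```python
import numpy as np

def projected_gradient_nnls(A, b, iters=500, tol=1e-10):
    """Minimal projected-gradient sketch for min ||Ax - b||^2 s.t. x >= 0.

    A simple baseline, not the interior point Newton-like method of the
    cited paper; the step size 1/L uses the gradient's Lipschitz constant.
    """
    L = np.linalg.norm(A, 2) ** 2           # Lipschitz constant of the gradient
    x = np.zeros(A.shape[1])
    for _ in range(iters):
        g = A.T @ (A @ x - b)               # gradient of (1/2)||Ax - b||^2
        x_new = np.maximum(x - g / L, 0.0)  # gradient step, then project onto x >= 0
        if np.linalg.norm(x_new - x) < tol:
            x = x_new
            break
        x = x_new
    return x

A = np.array([[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
b = np.array([1.0, -1.0, 0.5])
x = projected_gradient_nnls(A, b)
```

In this example the bound is active at the solution (the second component is clamped to zero), which is exactly the degenerate-boundary situation the cited paper's convergence theory is designed to handle.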
A Reduced Newton Method for Constrained Linear Least-Squares Problems. Cited by 9 (3 self).

We propose an iterative method that solves constrained linear least-squares problems by formulating them as nonlinear systems of equations and applying the Newton scheme. The method reduces the size of the linear system to be solved at each iteration by considering only a subset of the unknown variables; hence the linear system can be solved more efficiently. We prove that the method is locally quadratically convergent. Applications to image deblurring problems show that our method gives better restored images than those obtained by projecting or scaling the solution into the dynamic range.
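The size-reduction idea in this abstract — solving each linear system only over a subset of the unknowns — can be sketched for the nonnegative case as follows. This is our own illustrative simplification (variables at the bound with nonnegative gradient are frozen, and the least-squares subproblem is solved over the remaining free set), not the exact Newton scheme of the cited paper.

```python
import numpy as np

def reduced_newton_step(A, b, x, tol=1e-8):
    """One illustrative reduced step for min ||Ax - b||^2 s.t. x >= 0.

    Active variables (at the bound, with gradient pushing into the bound)
    are frozen, so the subproblem involves only the free columns of A.
    A sketch of the size-reduction idea, not the cited paper's scheme.
    """
    g = A.T @ (A @ x - b)
    free = ~((x <= tol) & (g > 0))          # freeze active variables
    x_new = x.copy()
    Af = A[:, free]                         # reduced, smaller system
    x_new[free] = np.maximum(np.linalg.lstsq(Af, b, rcond=None)[0], 0.0)
    return x_new

A = np.array([[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
b = np.array([1.0, -1.0, 0.5])
x = np.zeros(2)
for _ in range(5):
    x = reduced_newton_step(A, b, x)
```

Once the correct active set is identified, each iteration solves a strictly smaller least-squares system, which is the efficiency gain the abstract refers to.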
Projected filter trust region methods for a semismooth least-squares formulation of mixed complementarity problems. Optim. Methods Softw. Cited by 5 (0 self).

Abstract: A reformulation of the mixed complementarity problem as a box constrained overdetermined system of semismooth equations or, equivalently, a box constrained nonlinear least squares problem with zero residual is presented. Based on this reformulation, a trust region method for the solution of mixed complementarity problems is considered. This trust region method contains elements from different areas: a projected Levenberg-Marquardt step in order to guarantee fast local convergence under suitable assumptions, affine scaling matrices which are used to improve the global convergence properties, and a multidimensional filter technique to accept a full step more frequently. Global convergence results as well as local superlinear/quadratic convergence are shown under appropriate assumptions. Moreover, numerical results for the MCPLIB indicate that the overall method is quite robust.
An interior-point affine-scaling trust-region method for semismooth equations with box constraints. Comput. Optim. Appl. Cited by 3 (0 self).

Abstract. An algorithm for the solution of a semismooth system of equations with box constraints is described. The method is an affine-scaling trust-region method. All iterates generated by this method are strictly feasible. In this way, possible domain violations outside or on the boundary of the box are avoided. The method is shown to have strong global and local convergence properties under suitable assumptions, in particular, when the method is used with a special scaling matrix. Numerical results are presented for a number of problems arising from different areas. Key Words. Affine scaling, trust-region method, nonlinear equations, box constraints, semismooth functions, Newton's method.
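The "scaling matrix" in affine-scaling methods like this one is typically diagonal, with entries measuring the distance from each variable to the bound its negative gradient points toward. The sketch below is in the spirit of the well-known Coleman-Li scaling and is purely illustrative; it is not necessarily the special scaling matrix of the cited paper.

```python
import numpy as np

def affine_scaling_vector(x, g, lo, hi):
    """Coleman-Li style scaling vector for bounds lo <= x <= hi (illustrative).

    v_i is the distance from x_i to the bound toward which the negative
    gradient pushes it; far from the bounds the scaling has no effect,
    near an active bound it damps the step component automatically.
    """
    v = np.empty_like(x)
    for i in range(x.size):
        if g[i] < 0:   # descent direction pushes x_i up: upper bound is relevant
            v[i] = hi[i] - x[i] if np.isfinite(hi[i]) else 1.0
        else:          # descent direction pushes x_i down: lower bound is relevant
            v[i] = x[i] - lo[i] if np.isfinite(lo[i]) else 1.0
    return v
```

Because v_i shrinks to zero as an iterate approaches a bound it is being pushed against, the scaled step keeps all iterates strictly feasible, which is the property the abstract emphasizes.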
Model-Constrained Optimization methods for . . . , 2007.

Most model reduction techniques employ a projection framework that utilizes a reduced-space basis. The basis is usually formed as the span of a set of solutions of the large-scale system, which are computed for selected values (samples) of input parameters and forcing inputs. In existing model reduction techniques, choosing where and how many samples to generate has been, in general, an ad hoc process. A key challenge is therefore how to systematically sample the input space, which is of high dimension for many applications of interest. This thesis proposes and analyzes a model-constrained, greedy-based adaptive sampling approach in which the parametric input sampling problem is formulated as an optimization problem that targets an error estimation of reduced model output prediction. The method solves the optimization problem to find a locally optimal point.
A SCALABLE ALGORITHM FOR MAP ESTIMATORS IN BAYESIAN INVERSE PROBLEMS WITH BESOV PRIORS

(Communicated by Jari Kaipio) Abstract. We present a scalable solver for approximating the maximum a posteriori (MAP) point of Bayesian inverse problems with Besov priors based on wavelet expansions with random coefficients. It is a subspace trust region interior reflective Newton conjugate gradient method for bound constrained optimization problems. The method combines the rapid locally quadratic convergence of Newton's method, the effectiveness of trust region globalization for treating ill-conditioned problems, and the Eisenstat-Walker idea of preventing oversolving. We demonstrate the scalability of the proposed method on two inverse problems: a deconvolution problem and a coefficient inverse problem governed by elliptic partial differential equations. The numerical results show that the number of Newton iterations is independent of the number of wavelet coefficients n and the computation time scales linearly in n. Under our implementations, the proposed solver is numerically shown to be two times faster than the split Bregman approach, and an order of magnitude less expensive than the path-following primal-dual interior point method. Our results also confirm that the Besov B^1_{11} prior is sparsity promoting, discretization-invariant, and edge-preserving for both imaging and inverse problems governed by partial differential equations.
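The "Eisenstat-Walker idea of preventing oversolving" mentioned in this abstract refers to adaptively loosening the inner linear (CG) solve tolerance while the outer Newton residual is still large. A simplified sketch of one common forcing-term rule follows; parameter names and the safeguard are our own illustrative choices, not the cited paper's exact settings.

```python
def eisenstat_walker_forcing(r_new, r_old, eta_old,
                             eta_max=0.9, gamma=0.9, alpha=2.0):
    """Simplified Eisenstat-Walker forcing term for inexact Newton (sketch).

    The inner solve is stopped once its residual drops below
    eta * r_new, so far from the solution the linear system is solved
    only loosely ("no oversolving"), while near the solution eta -> 0
    and fast local convergence is retained.
    """
    eta = gamma * (r_new / r_old) ** alpha   # shrink as the outer residual shrinks
    eta = max(eta, gamma * eta_old ** alpha) # safeguard against collapsing too fast
    return min(eta, eta_max)
```

With this rule the cost per Newton iteration stays low early on, which is one ingredient behind the linear-in-n computation times the abstract reports.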
"... [Article] Constrained dogleg methods for nonlinear systems with simple bounds ..."
Abstract
 Add to MetaCart
(Show Context)
[Article] Constrained dogleg methods for nonlinear systems with simple bounds