Results 1–10 of 11
Necessary and sufficient optimality conditions for mathematical programs with equilibrium constraints
, 2005
N.M.: Variational stability and marginal functions via generalized differentiation
, 2005
Abstract

Cited by 10 (1 self)
Robust Lipschitzian properties of set-valued mappings and marginal functions play a crucial role in many aspects of variational analysis and its applications, especially for issues related to variational stability and optimization. We develop an approach to variational stability based on generalized differentiation. The principal achievements of this paper include new results on coderivative calculus for set-valued mappings and singular subdifferentials of marginal functions in infinite dimensions, with their extended applications to Lipschitzian stability. In this way we derive efficient conditions ensuring the preservation of Lipschitzian and related properties for set-valued mappings under various operations, with exact bound/modulus estimates, as well as new sufficient conditions for the Lipschitz continuity of marginal functions. Key words: optimization and variational analysis, robust stability and sensitivity, marginal and value functions, generalized differentiation
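For orientation, the robust Lipschitzian property discussed in this abstract is the Lipschitz-like (Aubin) property, and in finite dimensions the exact bound/modulus estimates mentioned rest on Mordukhovich's coderivative criterion. The following is a standard statement of that criterion, included here as background; it is not quoted from the paper itself.

```latex
% Coderivative (Mordukhovich) criterion, finite-dimensional form:
% a set-valued mapping S with closed graph is Lipschitz-like
% (has the Aubin property) around (\bar{x}, \bar{y}) \in \operatorname{gph} S iff
D^{*}S(\bar{x}, \bar{y})(0) = \{0\},
% in which case the exact Lipschitzian bound equals the coderivative norm:
\operatorname{lip} S(\bar{x}, \bar{y})
  = \|D^{*}S(\bar{x}, \bar{y})\|
  := \sup\bigl\{ \|x^{*}\| : x^{*} \in D^{*}S(\bar{x}, \bar{y})(y^{*}),\ \|y^{*}\| \le 1 \bigr\}.
```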
Stochastic Multiobjective Optimization: Sample Average Approximation and Applications
 J OPTIM THEORY APPL
, 2011
Abstract

Cited by 1 (0 self)
We investigate one-stage stochastic multiobjective optimization problems where the objectives are the expected values of random functions. Assuming that a closed form of the expected values is difficult to obtain, we apply the well-known Sample Average Approximation (SAA) method to solve them. We propose a smoothing infinity-norm scalarization approach to solve the SAA problem and analyse the convergence of efficient solutions of the SAA problem to those of the original problem as the sample size increases. Under some moderate conditions, we show that, with probability approaching one exponentially fast as the sample size increases, an optimal solution to the SAA problem becomes an optimal solution to its true counterpart. Moreover, under second-order growth conditions, we show that an efficient point of the smoothed problem approximates an efficient solution of the true problem at a linear rate. Finally, we describe numerical experiments on some stochastic multiobjective optimization problems and report preliminary results.
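As a concrete illustration of the SAA-plus-smoothing scheme this abstract describes, here is a minimal sketch. The two random objectives, the log-sum-exp smoothing of the max, the smoothing parameter, and the grid-search "solver" are all made-up assumptions for illustration; the paper's actual algorithm and convergence analysis are not reproduced.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative two-objective problem (made up for this sketch, not taken
# from the paper): f_i(x) = E[F_i(x, xi)] with xi ~ N(0, 1).
def saa_objectives(x, samples):
    """Sample average approximation of the expected objective vector."""
    f1 = np.mean((x[0] - samples) ** 2) + x[1] ** 2
    f2 = x[0] ** 2 + np.mean((x[1] - 1.0 - samples) ** 2)
    return np.array([f1, f2])

def smoothed_max(f, t=0.1):
    """Log-sum-exp smoothing of max(f): a smooth stand-in for the
    infinity-norm scalarization; as t -> 0 it recovers the true max."""
    m = f.max()
    return m + t * np.log(np.exp((f - m) / t).sum())

samples = rng.normal(size=500)

# A crude grid search stands in for a smooth NLP solver.
grid = np.linspace(-1.0, 2.0, 31)
best = min(((x0, x1) for x0 in grid for x1 in grid),
           key=lambda x: smoothed_max(saa_objectives(np.array(x), samples)))
print(best)  # a candidate (weakly) efficient point of the SAA problem
```

For this toy data the SAA objectives are close to (x0^2 + x1^2 + 1, x0^2 + (x1 - 1)^2 + 1), so the smoothed scalarization picks a point near x0 = 0 with x1 between the two individual minimizers.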
Approximating Stationary Points of Stochastic Mathematical Programs with Equilibrium Constraints via Sample Averaging
, 2011
Abstract

Cited by 1 (1 self)
We investigate sample average approximation of a general class of one-stage stochastic mathematical programs with equilibrium constraints. By using graphical convergence of unbounded set-valued mappings, we demonstrate almost sure convergence of a sequence of stationary points of sample average approximation problems to their true counterparts as the sample size increases. In particular, we show the convergence of M(Mordukhovich)-stationary points and C(Clarke)-stationary points of the sample average approximation problem to those of the true problem. The research complements the existing work in the literature by considering a general equilibrium constraint represented by a stochastic generalized equation and by exploiting graphical convergence of coderivative mappings.
On the robustness of global optima and stationary solutions to stochastic mathematical programs with equilibrium constraints
, 2009
METHODS OF VARIATIONAL ANALYSIS IN MULTIOBJECTIVE OPTIMIZATION
Abstract
The paper concerns new applications of advanced methods of variational analysis and generalized differentiation to constrained problems of multiobjective/vector optimization. We pay the main attention to general notions of optimal solutions for multiobjective problems that are induced by geometric concepts of extremality in variational analysis, while covering various notions of Pareto and other types of optimality/efficiency conventional in multiobjective optimization. Based on the extremal principles in variational analysis and on appropriate tools of generalized differentiation with well-developed calculus rules, we derive necessary optimality conditions for broad classes of constrained multiobjective problems in the framework of infinite-dimensional spaces. Applications of variational techniques in infinite dimensions require certain "normal compactness" properties of sets and set-valued mappings, which play a crucial role in deriving the main results of this paper.
Positive Definiteness of HighOrder Subdifferential and HighOrder Optimality Conditions in Vector Optimization Problems
Abstract
We obtain a new Taylor's formula in terms of the (k+1)-order subdifferential of a C^{k,1} function from R^n to R^m. As its applications in optimization problems, we build (k+1)-order sufficient optimality conditions for this kind of function and (k+1)-order necessary conditions for strongly quasiconvex functions.
A UNIFIED SEPARATION THEOREM FOR CLOSED SETS IN A BANACH SPACE AND OPTIMALITY CONDITIONS FOR VECTOR OPTIMIZATION
Abstract
Abstract. Using the technique of variational analysis and in terms of normal cones, we establish unified separation results for finitely many closed (not necessarily convex) sets in Banach spaces, which not only cover the existing nonconvex separation results and a classical convex separation theorem, but also recapture the approximate projection theorem. With the help of the separation result for closed sets, we provide necessary and sufficient conditions for approximate Pareto solutions of constrained vector optimization problems. In particular, we extend some basic optimality results for approximate solutions of numerical optimization problems to the vector optimization setting.
On the robustness of global optima and stationary solutions to stochastic mathematical programs with equilibrium constraints, part I: Theory
Abstract