Results 1–10 of 55
Optimization by Direct Search: New Perspectives on Some Classical and Modern Methods
 SIAM REVIEW VOL. 45, NO. 3, PP. 385–482
, 2003
Cited by 237 (15 self)

Abstract:
Direct search methods are best known as unconstrained optimization techniques that do not explicitly use derivatives. Direct search methods were formally proposed and widely applied in the 1960s but fell out of favor with the mathematical optimization community by the early 1970s because they lacked coherent mathematical analysis. Nonetheless, users remained loyal to these methods, most of which were easy to program, some of which were reliable. In the past fifteen years, these methods have seen a revival due, in part, to the appearance of mathematical analysis, as well as to interest in parallel and distributed computing. This review begins by briefly summarizing the history of direct search methods and considering the special properties of problems for which they are well suited. Our focus then turns to a broad class of methods for which we provide a unifying framework that lends itself to a variety of convergence results. The underlying principles allow generalization to handle bound constraints and linear constraints. We also discuss extensions to problems with nonlinear constraints.
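To make the idea concrete, here is a minimal compass (coordinate) search, one of the simplest members of the direct search family: it compares function values along a fixed stencil of directions and contracts the step when no direction improves, using no derivatives at all. This is an illustrative toy, not any specific algorithm from the review.

```python
def compass_search(f, x, step=1.0, tol=1e-6, max_iter=1000):
    """Minimal compass (coordinate) search: a direct search method that
    uses only function-value comparisons, never derivatives."""
    n = len(x)
    fx = f(x)
    for _ in range(max_iter):
        improved = False
        # Poll the 2n coordinate directions +/- e_i scaled by the step.
        for i in range(n):
            for sign in (+1.0, -1.0):
                y = list(x)
                y[i] += sign * step
                fy = f(y)
                if fy < fx:          # simple decrease is accepted
                    x, fx, improved = y, fy, True
                    break
            if improved:
                break
        if not improved:
            step *= 0.5              # unsuccessful poll: contract the mesh
            if step < tol:
                break
    return x, fx

# Example: minimize a shifted quadratic with minimizer (1, -2).
xmin, fmin = compass_search(lambda v: (v[0] - 1)**2 + (v[1] + 2)**2,
                            [0.0, 0.0])
```

The contraction-on-failure rule is exactly the mechanism that the convergence analyses mentioned above make rigorous.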
Asynchronous parallel pattern search for nonlinear optimization
 SIAM J. Sci. Comput
, 2001
Cited by 87 (11 self)

Abstract:
Asynchronous parallel pattern search (APPS) is a nonlinear optimization algorithm that dynamically initiates actions in response to events, rather than cycling through a fixed set of search directions, as is the case for synchronous pattern search. This gives us a versatile concurrent strategy that allows us to effectively balance the computational load across all available processors. However, the semiautonomous nature of the search complicates the analysis. We concentrate on elucidating the concepts and notation required to track the iterates produced by APPS across all participating processes. To do so, we consider APPS and its synchronous counterpart (PPS) applied to a simple problem. This allows us both to introduce the bookkeeping we found necessary for the analysis and to highlight some of the fundamental differences between APPS and PPS.
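The contrast between the two regimes can be sketched in a few lines: a synchronous poll waits for every direction to finish before choosing the best, while an asynchronous poll acts on the first improving result it sees. This toy uses Python threads and omits all of the actual APPS bookkeeping; the function names are illustrative, not from the paper.

```python
from concurrent.futures import ThreadPoolExecutor, as_completed

def poll_points(x, step):
    """Poll set: the coordinate directions scaled by the step."""
    pts = []
    for i in range(len(x)):
        for s in (step, -step):
            y = list(x)
            y[i] += s
            pts.append(y)
    return pts

def synchronous_poll(f, x, step, pool):
    """PPS-style: wait for *all* poll evaluations, then take the best."""
    pts = poll_points(x, step)
    vals = list(pool.map(f, pts))
    best = min(range(len(pts)), key=lambda k: vals[k])
    return pts[best], vals[best]

def asynchronous_poll(f, x, step, pool):
    """APPS-flavoured: act on the *first* evaluation that improves,
    without waiting for the remaining directions to finish."""
    pts = poll_points(x, step)
    fx = f(x)
    futures = {pool.submit(f, p): p for p in pts}
    for fut in as_completed(futures):
        if fut.result() < fx:
            return futures[fut], fut.result()   # early accept
    return x, fx                                 # no improvement found

f = lambda v: v[0]**2 + v[1]**2
with ThreadPoolExecutor(max_workers=4) as pool:
    sx, sf = synchronous_poll(f, [2.0, 0.0], 1.0, pool)
    ax, af = asynchronous_poll(f, [2.0, 0.0], 1.0, pool)
```

The asynchronous variant never idles a processor waiting on a slow evaluation, which is the load-balancing advantage the abstract describes, at the cost of the more intricate iterate-tracking analysis.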
Derivative-free optimization: A review of algorithms and comparison of software implementations
Mixed Variable Optimization of the Number and Composition of Heat Intercepts in a Thermal Insulation System
, 2000
Cited by 25 (6 self)

Abstract:
In the literature, thermal insulation systems with a fixed number of heat intercepts have been optimized with respect to intercept locations and temperatures. The number of intercepts and the types of insulators that surround them were chosen by parametric studies, because the optimization methods used could not treat such categorical variables. Discrete optimization variables are categorical if the objective function or the constraints cannot be evaluated unless the variables take one of a prescribed enumerable set of values. The key issue is that categorical variables cannot be treated as ordinary discrete variables are, i.e., relaxed to continuous variables with a side constraint that they be discrete at the solution. A new mixed variable programming (MVP) algorithm makes it possible to optimize directly with respect to mixtures of discrete, continuous, and categorical decision variables. The result of applying MVP is shown here to give a 65% reduction in the ...
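A sketch of what "optimizing directly over categorical variables" means in practice: the poll alternates continuous coordinate moves with moves to user-defined categorical neighbours, and the categorical value is only ever one of its enumerable choices, never relaxed. The problem, variable names, and neighbour rule below are all invented for illustration; this is not the MVP algorithm of the paper.

```python
def mixed_variable_poll(f, x_cont, x_cat, cat_neighbors, step):
    """One poll step on a mixed-variable objective: continuous coordinate
    moves plus user-defined categorical neighbours. The categorical value
    always stays inside its enumerable set."""
    best_c, best_k = list(x_cont), x_cat
    best_f = f(x_cont, x_cat)
    # Continuous poll directions.
    for i in range(len(x_cont)):
        for s in (step, -step):
            y = list(x_cont)
            y[i] += s
            fy = f(y, x_cat)
            if fy < best_f:
                best_c, best_k, best_f = y, x_cat, fy
    # Categorical poll: try each neighbouring category.
    for k in cat_neighbors(x_cat):
        fy = f(x_cont, k)
        if fy < best_f:
            best_c, best_k, best_f = list(x_cont), k, fy
    return best_c, best_k, best_f

# Toy problem: the insulator material (a categorical variable) shifts
# both the optimum location and a fixed cost term.
shift = {"foam": 0.0, "fiberglass": 1.0, "aerogel": 2.0}
cost = {"foam": 3.0, "fiberglass": 1.0, "aerogel": 0.5}
f = lambda xc, mat: (xc[0] - shift[mat])**2 + cost[mat]
neighbors = lambda mat: [m for m in shift if m != mat]
xc, mat, val = mixed_variable_poll(f, [0.0], "foam", neighbors, 1.0)
```

Note that `f` cannot even be evaluated without a legal material name, which is exactly the defining property of a categorical variable given in the abstract.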
Comparison of a generalized pattern search and a genetic algorithm optimization method
 Proc. 8th International Building Performance Simulation Association Conference, Vol. III
, 2003
Optimal Aeroacoustic Shape Design Using the Surrogate Management Framework
 Optimization and Engineering
, 2004
Cited by 21 (6 self)

Abstract:
Shape optimization is applied to time-dependent trailing-edge flow in order to minimize aerodynamic noise. Optimization is performed using the surrogate management framework (SMF), a non-gradient-based pattern search method chosen for its efficiency and rigorous convergence properties. Using SMF, design space exploration is performed not with the expensive actual function but with an inexpensive surrogate function. The use of a polling step in the SMF guarantees that the algorithm generates a convergent subsequence of mesh points, each iterate of which is a local minimizer of the cost function on a mesh in the parameter space. Results are presented for an unsteady laminar flow past an acoustically compact airfoil. Constraints on lift and drag are handled within SMF by applying the filter pattern search method of Audet and Dennis, within which a penalty function is used to form and optimize a surrogate function.
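The division of labour in a surrogate management framework can be sketched as a single search-plus-poll cycle: the cheap surrogate proposes a candidate, the expensive true function certifies it, and only if both fail does the mesh refine. This is a toy in the spirit of SMF, with invented function names, not the framework as specified in the paper.

```python
def smf_step(f_true, f_surr, x, step):
    """One search+poll cycle: explore with the cheap surrogate, certify
    any move with the expensive true function, refine the mesh on failure."""
    n = len(x)
    stencil = []
    for i in range(n):
        for s in (step, -step):
            y = list(x)
            y[i] += s
            stencil.append(y)
    fx = f_true(x)
    # SEARCH step: rank trial points by the surrogate, then check the
    # most promising one against the true function.
    guess = min(stencil, key=f_surr)
    if f_true(guess) < fx:
        return guess, step, "search"
    # POLL step: true-function evaluation over the whole stencil; this
    # is the step that carries the convergence guarantee.
    for y in stencil:
        if f_true(y) < fx:
            return y, step, "poll"
    return x, step / 2, "refine"

f_true = lambda v: (v[0] - 2)**2 + v[1]**2      # "expensive" objective
f_surr = lambda v: (v[0] - 1.8)**2 + v[1]**2    # cheap, slightly biased model
x, step, phase = smf_step(f_true, f_surr, [0.0, 0.0], 1.0)
```

Even though the surrogate's minimizer is biased, its ranking of trial points is good enough here for the search step to succeed with a single expensive evaluation.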
Scatter Search for chemical and bioprocess optimization
 Journal of Global Optimization
, 2007
Cited by 17 (3 self)

Abstract:
Scatter search is a population-based method that has recently been shown to yield promising outcomes for solving combinatorial and nonlinear optimization problems. Based on formulations originally proposed in the 1960s for combining decision rules and problem constraints, such as the surrogate constraint method, scatter search uses strategies for combining solution vectors that have proved effective in a variety of problem settings. In this paper, we develop a general-purpose heuristic for a class of nonlinear optimization problems. The procedure is based on the scatter search methodology and treats the objective function evaluation as a black box, making the search algorithm context-independent. Most optimization problems in the chemical and biochemical industries are highly nonlinear in either the objective function or the constraints. Moreover, they usually present differential-algebraic systems of constraints. In this type of problem, the evaluation of a solution, or even the feasibility test of a set of values for the decision variables, is a time-consuming operation. In this context, the solution method is limited to a reduced number of solution examinations. We have implemented a scatter search procedure in Matlab for this special class of difficult optimization problems. Our development goes beyond a simple exercise of applying scatter search to this class of problem: it presents innovative mechanisms to obtain a good balance between intensification and diversification in a short-term search horizon. Computational comparisons with other recent methods over a set of benchmark problems favor the proposed procedure.
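A bare-bones version of the scatter search skeleton, showing the diversification / reference set / solution-combination loop on a black-box objective. All names and the simple line-blending combination rule are illustrative choices, not the Matlab procedure the paper describes.

```python
import random

def scatter_search(f, bounds, ref_size=5, pop=20, iters=30, seed=0):
    """Bare-bones scatter search: diversify a random population, keep a
    small reference set, and combine reference pairs by convex blending.
    The objective f is treated as a black box."""
    rng = random.Random(seed)
    rand_pt = lambda: [rng.uniform(lo, hi) for lo, hi in bounds]
    # Diversification: random initial population; best few form the RefSet.
    refset = sorted((rand_pt() for _ in range(pop)), key=f)[:ref_size]
    for _ in range(iters):
        # Combination: blend every pair of reference solutions.
        trials = []
        for i in range(ref_size):
            for j in range(i + 1, ref_size):
                w = rng.random()
                trials.append([w * a + (1 - w) * b
                               for a, b in zip(refset[i], refset[j])])
        # Reference set update: keep the best of old + new (intensification).
        refset = sorted(refset + trials, key=f)[:ref_size]
    return refset[0], f(refset[0])

best, val = scatter_search(lambda v: (v[0] - 0.5)**2 + (v[1] + 0.5)**2,
                           [(-2, 2), (-2, 2)])
```

Because each evaluation of `f` may be expensive in the differential-algebraic setting the abstract describes, the small reference set and fixed iteration budget cap the number of solution examinations.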
Mixed variable optimization of a load-bearing thermal insulation system
, 2002
Cited by 16 (4 self)
Keywords: categorical variables, mixed variable programming, pattern search algorithm, filter algorithm, nonlinear constraints
Abstract: This paper describes the optimization of a load-bearing thermal insulation system characterized by hot and cold surfaces with a series of heat intercepts and insulators between them. The optimization problem is represented as a mixed variable programming (MVP) problem with nonlinear constraints, in which the objective is to minimize the power required to maintain the heat intercepts at fixed temperatures so that one surface is kept sufficiently cold. MVP problems are more general than mixed integer nonlinear programming (MINLP) problems in that the discrete variables are categorical; i.e., they must always take on values from a predefined enumerable set or list. Thus, traditional approaches that use branch and bound techniques cannot be applied. In a previous paper, a linearly constrained version of this problem was solved numerically using the Audet–Dennis generalized pattern search (GPS) method for MVP problems. However, this algorithm may not work for problems with general nonlinear constraints. A new algorithm that extends that of Audet and Dennis by incorporating a filter to handle nonlinear constraints makes it possible to solve the more general problem. Additional nonlinear constraints on stress, mass, and thermal contraction are added to those of the previous work in an effort to find a more realistic feasible design. Several computational experiments show a substantial improvement in power required to maintain the system, as compared to the previous literature. The addition of the new constraints leads to a very different design without significantly changing the power required. The results demonstrate that the new algorithm can be applied to a very broad class of optimization problems for which no previous algorithm with provable convergence results could be applied.
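The filter mechanism at the heart of this nonlinear-constraint handling can be illustrated with a few lines: each trial point is scored by its objective value f and its aggregate constraint violation h, and a point is acceptable unless some stored pair dominates it in both measures. This is a generic sketch of the filter idea, with invented helper names, not the specific algorithm of the paper.

```python
def filter_accepts(filt, f_val, h_val):
    """A trial point (f, h) is acceptable to the filter if no stored
    pair is at least as good in *both* objective value f and
    constraint violation h."""
    return not any(f0 <= f_val and h0 <= h_val for f0, h0 in filt)

def filter_add(filt, f_val, h_val):
    """Insert the pair and drop any stored entries it dominates."""
    filt = [(f0, h0) for f0, h0 in filt
            if not (f_val <= f0 and h_val <= h0)]
    filt.append((f_val, h_val))
    return filt

filt = []
filt = filter_add(filt, 5.0, 0.2)     # first accepted point
ok = filter_accepts(filt, 4.0, 0.3)   # better f, worse h: acceptable
bad = filter_accepts(filt, 6.0, 0.3)  # worse in both: dominated, rejected
filt = filter_add(filt, 4.0, 0.1)     # dominates and evicts (5.0, 0.2)
```

Unlike a penalty method, the filter needs no weighting parameter to trade off objective against violation; it simply keeps the non-dominated frontier of points seen so far.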