Results 1–10 of 134
A Comprehensive Survey of Fitness Approximation in Evolutionary Computation
, 2003
Abstract

Cited by 174 (11 self)
Evolutionary algorithms (EAs) have received increasing interest in both academia and industry. One main difficulty in applying EAs to real-world applications is that they usually need a large number of fitness evaluations before a satisfactory result can be obtained. However, fitness evaluations are not always straightforward in many real-world applications: either an explicit fitness function does not exist, or evaluating the fitness is computationally very expensive. In both cases, it is necessary to estimate the fitness function by constructing an approximate model. In this paper, a comprehensive survey of the research on fitness approximation in evolutionary computation is presented. The main issues, such as approximation levels, approximate model management schemes, and model construction techniques, are reviewed. To conclude, open questions and interesting issues in the field are discussed.
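The evaluation-saving idea the survey describes can be sketched in a few lines. This is an illustrative toy, not the survey's method: the sphere objective, the nearest-neighbour surrogate, and all names and parameters below are assumptions chosen for brevity.

```python
import random

def true_fitness(x):
    # Stand-in for an expensive objective (sphere function).
    return sum(v * v for v in x)

def surrogate(archive, x):
    # Crudest possible approximate model: fitness of the nearest evaluated point.
    nearest = min(archive, key=lambda p: sum((a - b) ** 2 for a, b in zip(p[0], x)))
    return nearest[1]

def surrogate_assisted_es(dim=3, generations=30, candidates=10, seed=0):
    rng = random.Random(seed)
    parent = [rng.uniform(-5.0, 5.0) for _ in range(dim)]
    parent_f = true_fitness(parent)
    archive = [(parent, parent_f)]
    evaluations = 1
    for _ in range(generations):
        offspring = [[v + rng.gauss(0.0, 0.5) for v in parent]
                     for _ in range(candidates)]
        # Pre-screen all candidates with the surrogate; spend one real
        # evaluation only on the most promising one.
        best = min(offspring, key=lambda c: surrogate(archive, c))
        f = true_fitness(best)
        evaluations += 1
        archive.append((best, f))
        if f < parent_f:
            parent, parent_f = best, f
    return parent_f, evaluations
```

With 30 generations of 10 candidates, the loop spends 31 true evaluations instead of the 301 a plain evolution strategy would need, which is the trade-off the survey's model-management schemes are designed to control.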
Surrogate-Assisted Evolutionary Optimization Frameworks for High-Fidelity Engineering Design Problems
 In Knowledge Incorporation in Evolutionary Computation
, 2004
Abstract

Cited by 31 (6 self)
Over the last decade, Evolutionary Algorithms (EAs) have emerged as a powerful paradigm for global optimization of multimodal functions. More recently, there has been significant interest in applying EAs to engineering design problems. However, in many complex engineering design problems where high-fidelity analysis models are used, each function evaluation may require a Computational Structural Mechanics (CSM), Computational Fluid Dynamics (CFD), or Computational Electromagnetics (CEM) simulation costing minutes to hours of supercomputer time. Since EAs typically require thousands of function evaluations to locate a near-optimal solution, the use of EAs often becomes computationally prohibitive for this class of problems. In this paper, we present frameworks that employ surrogate models for solving computationally expensive optimization problems on a limited computational budget. In particular, the key factors responsible for the success of these frameworks are discussed. Experimental results obtained on benchmark test functions and real-world complex design problems are presented.
Design optimization of hierarchically decomposed multilevel system under uncertainty
 Proceedings of the ASME 2004 Design Engineering Technical Conferences, Salt Lake City, Utah, 28 September–2 October, DETC2004/DAC57357
, 2004
Abstract

Cited by 26 (12 self)
This paper presents a methodology for design optimization of decomposed systems in the presence of uncertainties. We extend the analytical target cascading (ATC) formulation to probabilistic design by treating stochastic quantities as random variables and parameters and posing reliability-based design constraints. We model the propagation of uncertainty throughout the multilevel hierarchy of elements that comprise the decomposed system by using the advanced mean value (AMV) method to generate the required probability distributions of nonlinear responses. We utilize appropriate metamodeling techniques for simulation-based design problems. A simple yet illustrative hierarchical bi-level engine design problem is used to demonstrate the proposed methodology.
Design and Analysis of Computer Experiments in Multidisciplinary Design Optimization: A Review of How Far We Have Come – or Not
Abstract

Cited by 26 (1 self)
The use of metamodeling techniques in the design and analysis of computer experiments has progressed remarkably in the past two decades, but how far have we really come? This is the question that we investigate in this paper, namely, the extent to which the use of metamodeling techniques in multidisciplinary design optimization has evolved in the two decades since the seminal paper on Design and Analysis of Computer Experiments by Sacks et al. As part of this review, we examine the motivation for advancements in metamodeling techniques from both a historical perspective and the research itself. Based on current thrusts in the field, we emphasize multilevel/multifidelity approximations and ensembles of metamodels, as well as the availability of metamodels within commercial software and their use for design space exploration and visualization. Our closing remarks offer insight into future research directions, nearly the same ones that have motivated us in the past.
The Use of Metamodeling Techniques for Optimization Under Uncertainty
 2001 ASME Design Automation Conference, Paper No. DAC21039
, 2001
Abstract

Cited by 25 (2 self)
Metamodeling techniques have been widely used in engineering design to improve the efficiency of simulation and optimization for design systems that involve computationally expensive simulation programs. Many existing applications are restricted to deterministic optimization; very few studies have examined the accuracy of using metamodels for optimization under uncertainty. In this paper, using a two-bar structural system design as an example, various metamodeling techniques are tested for different formulations of optimization under uncertainty. Observations are made on the applicability and accuracy of these techniques, the impact of sample size, and the optimization performance when different formulations are used to incorporate uncertainty. Some important issues for applying metamodels to optimization under uncertainty are discussed.
Adaptive Response Surface Method Using Inherited Latin Hypercube Design Points
 Transactions of the ASME, Journal of Mechanical Design
, 2003
Abstract

Cited by 25 (3 self)
This paper addresses the difficulty of the previously developed Adaptive Response Surface Method (ARSM) with high-dimensional design problems. The ARSM was developed to search for the global design optimum for computation-intensive design problems. This method utilizes Central Composite Design (CCD), which results in an exponentially increasing number of required design experiments. In addition, the ARSM generates a completely new set of CCD samples in a gradually reduced design space. These two factors greatly undermine the efficiency of the ARSM. In this work, Latin Hypercube Design (LHD) is utilized to generate saturated design experiments. Because of the use of LHD, historical design experiments can be inherited in later iterations. As a result, the ARSM only requires a limited number of design experiments even for high-dimensional design problems. The improved ARSM is tested using a group of standard test problems and then applied to an engineering design problem. In both testing and the design application, significant improvement in the efficiency of the ARSM is realized. The improved ARSM demonstrates strong potential to be a practical global optimization tool for computation-intensive design problems. Inheriting LHD samples, as a general sampling strategy, can be integrated into other approximation-based design optimization methodologies.
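The Latin Hypercube property the authors exploit, exactly one sample per stratum along every dimension, is easy to see in a minimal generator. This sketch is illustrative only (the paper's inheritance scheme for reusing points across iterations is not shown):

```python
import random

def latin_hypercube(n_samples, n_dims, seed=0):
    # Stratify [0, 1) into n_samples equal slices per dimension, place one
    # jittered point in each slice, then shuffle the slice order
    # independently per dimension.
    rng = random.Random(seed)
    columns = []
    for _ in range(n_dims):
        col = [(i + rng.random()) / n_samples for i in range(n_samples)]
        rng.shuffle(col)
        columns.append(col)
    return [[columns[d][i] for d in range(n_dims)] for i in range(n_samples)]
```

Because each dimension covers all strata with only n_samples points, an LHD stays "saturated" regardless of dimensionality, in contrast to the exponential growth of a Central Composite Design.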
Agus Sudjianto. Analysis of Computer Experiments Using Penalized Likelihood in Gaussian Kriging Models
 Journal of the American Statistical Association
Abstract

Cited by 23 (2 self)
Kriging is a popular approach for analyzing computer experiments, used to create a cheap-to-compute “metamodel” as a surrogate to a computationally expensive engineering simulation model. The maximum likelihood approach is used to estimate the parameters in the kriging model. However, the likelihood function near the optimum may be flat in some situations, which leads to maximum likelihood estimates for the parameters in the covariance matrix that have very large variance. To overcome this difficulty, a penalized likelihood approach is proposed for the kriging model. Both theoretical analysis and empirical experience with real-world data suggest that the proposed method is particularly important in the context of a computationally intensive simulation model, where the number of simulation runs must be kept small because collection of a large sample set is prohibitive. The proposed approach is applied to the reduction of piston slap, an unwanted engine noise due to piston secondary motion. Issues related to practical implementation of the proposed approach are discussed.
State-of-the-Art Review: A User’s Guide to the Brave New World of Designing Simulation Experiments
 INFORMS Journal on Computing
, 2005
Abstract

Cited by 23 (4 self)
Many simulation practitioners can get more from their analyses by using the statistical theory on design of experiments (DOE) developed specifically for exploring computer models. We discuss a toolkit of designs for simulators with limited DOE expertise who want to select a design and an appropriate analysis for their experiments. Furthermore, we provide a research agenda listing problems in the design of simulation experiments, as opposed to real-world experiments, that require more investigation. We consider three types of practical problems: (1) developing a basic understanding of a particular simulation model or system, (2) finding robust decisions or policies as opposed to so-called optimal solutions, and (3) comparing the merits of various decisions or policies. Our discussion emphasizes aspects that are typical for simulation, such as having many more factors than in real-world experiments, and the sequential nature of the data collection. Because the same problem type may be addressed through different design types, we discuss quality attributes of designs, such as the ease of design construction, the flexibility for analysis, and efficiency considerations. Moreover, the selection of the design type depends on the metamodel (response surface) that the analysts tentatively assume ...
Adaptive Experimental Design For Construction Of Response Surface Approximations
, 2001
Abstract

Cited by 21 (10 self)
Sequential Approximate Optimization (SAO) is a class of methods for the multidisciplinary design optimization (MDO) of complex systems that are composed of several coupled disciplines. One of the approaches used for SAO is based on a quadratic response surface approximation, where zeroth- and first-order information are required. In these methods, designers must generate and query a database of order O(n²) in order to compute the second-order terms of the quadratic response surface approximation. As the number of design variables grows, the computational cost of generating the required database becomes a concern. In this paper, we present a new approach that requires just O(n) parameters for constructing a second-order approximation. This is accomplished by transforming the matrix of second-order terms into canonical form. The method periodically requires an order-O(n²) update of the second-order approximation to maintain accuracy. Results show ...
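The O(n²) database size comes directly from counting the coefficients of a full quadratic model, which a one-line helper (illustrative, not from the paper) makes concrete:

```python
def quadratic_coefficients(n):
    # Full quadratic response surface in n design variables:
    # 1 constant + n linear + n pure-quadratic + n(n-1)/2 cross terms
    # = (n + 1)(n + 2) / 2 coefficients in total.
    return 1 + 2 * n + n * (n - 1) // 2
```

For example, 10 variables already need 66 coefficients and 50 variables need 1326, which is why reducing the per-iteration requirement to O(n) parameters matters for larger MDO problems.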
Model Validation via Uncertainty Propagation Using Response Surface Models
 ASME 2002 Design Engineering Technical Conferences, Montreal, Canada, September 29 – October 2, 2002
, 2002
Abstract

Cited by 18 (5 self)
Model validation has become a primary means to evaluate the accuracy and reliability of computational simulations in engineering design. Mathematical models enable engineers to establish the most likely response of a system. However, despite the enormous power of computational models, uncertainty is inevitable in all model-based engineering design problems, due to variation in the physical system itself, lack of knowledge, and the use of assumptions by model builders. Therefore, realistic mathematical models should account for uncertainties. Because of these uncertainties, the assessment of the validity of a modeling approach must be based on stochastic measurements, to give designers confidence in using a model. In this paper, a generic model validation methodology via uncertainty propagation is presented. The approach reduces the number of physical tests at each design setting to one by shifting the evaluation effort to uncertainty propagation through the computational model. Response surface methodology is used to create metamodels as less costly approximations of simulation models for uncertainty propagation. The methodology is illustrated by examining the validity of a finite-element analysis model for predicting springback angles in a sample flanging process.
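Propagating input uncertainty through a cheap metamodel rather than the expensive simulation can be sketched with Monte Carlo sampling. The quadratic below is a hypothetical fitted response surface, not the paper's actual springback model, and all values are invented for illustration:

```python
import random
import statistics

def metamodel(x):
    # Hypothetical fitted response surface standing in for an expensive
    # finite-element simulation of, e.g., a springback angle.
    return 12.0 + 3.0 * x + 0.5 * x * x

def propagate_uncertainty(mean, std, n=20000, seed=1):
    # Sample the uncertain input and push each draw through the cheap
    # metamodel instead of re-running the simulation at every sample.
    rng = random.Random(seed)
    samples = [metamodel(rng.gauss(mean, std)) for _ in range(n)]
    return statistics.mean(samples), statistics.stdev(samples)
```

With a normally distributed input (mean 0, standard deviation 0.1), the 20,000 metamodel evaluations cost essentially nothing, whereas the same propagation through a finite-element model would be prohibitive, which is the methodology's central trade-off.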