Results 1-10 of 11
An Efficient Variable Selection Approach for Analyzing Designed Experiments
, 2005
Abstract

Cited by 21 (1 self)
The analysis of experiments where a large number of potential variables are examined is driven by the principles of effect sparsity, effect hierarchy, and effect heredity. We propose an efficient variable selection strategy to specifically address the unique challenges faced by such analyses. The proposed methods are natural extensions of a general-purpose variable selection algorithm, LARS (Efron et al., 2004). They are very fast to compute and can find sparse models that better satisfy the goals of experiments. Simulations and real examples are used to illustrate the wide applicability of the proposed methods.
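To illustrate the effect heredity principle invoked above (a generic sketch only, not the paper's modified-LARS algorithm), a selected model can be post-processed so that an interaction term is retained only when all of its parent main effects are also selected:

```python
# Minimal sketch of strong effect heredity (illustration only, not the
# paper's modified-LARS procedure): an interaction "A:B" is kept only
# if both parent main effects "A" and "B" were also selected.

def enforce_strong_heredity(selected):
    """Drop interactions whose parent main effects are not selected."""
    mains = {e for e in selected if ":" not in e}
    kept = []
    for effect in selected:
        if ":" in effect:
            if all(p in mains for p in effect.split(":")):
                kept.append(effect)
        else:
            kept.append(effect)
    return kept

# "B:C" is dropped because the main effect C was not selected.
model = enforce_strong_heredity(["A", "B", "A:B", "B:C"])
print(model)  # -> ['A', 'B', 'A:B']
```

Weak heredity would instead keep an interaction if at least one parent is selected (replace `all` with `any`).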
Functionally Induced Priors for the Analysis of Experiments
 Technometrics
, 2007
Abstract

Cited by 9 (6 self)
This work extends and develops the idea of using functional priors for the design and analysis of three- and higher-level experiments. Developing a prior distribution for model parameters is challenging because a factor can be qualitative or quantitative. We propose appropriate correlation functions and coding schemes so that the prior distribution is simple and the results interpretable. The prior incorporates well-known principles such as effect hierarchy and effect heredity, which helps to resolve the aliasing problems in fractional designs almost automatically. The usefulness of the new approach is illustrated through the analysis of some real experiments.
Statistical Adjustments to Engineering Models
, 2008
Abstract

Cited by 3 (0 self)
A common problem in engineering is that models developed from the underlying physics of a process do not always match satisfactorily with reality. On the other hand, pure statistical models fitted to experimental data can give more realistic predictions, but can perform poorly when predictions are made away from the observed data points. This article proposes engineering-statistical models that combine the advantages of engineering models and statistical models. The engineering-statistical model is obtained through some adjustments to the engineering model using experimental data. The adjustments are done in a sequential way and are based on empirical Bayes methods. We also develop approximate frequentist procedures for adjustments that are computationally much easier to implement. The usefulness of the methodology is illustrated using a problem of predicting surface roughness in a micro-cutting process.
Analysis of Optimization Experiments
Abstract

Cited by 1 (1 self)
The typical practice for analyzing industrial experiments is to identify statistically significant effects at a 5% level of significance and then to optimize the model containing only those effects. In this article, we illustrate the danger in utilizing this approach. We propose methodology using the practical significance level, which is a quantity that a practitioner can easily specify. We also propose utilizing empirical Bayes estimation, which gives shrinkage estimates of the effects. Interestingly, the mechanics of statistical testing can be viewed as an approximation to empirical Bayes estimation, but with a significance level in the range of 15-40%. We also establish the connections that our approach has with a lesser-known but intriguing technique proposed by Taguchi, known as the beta coefficient method. A real example and simulations are used to demonstrate the advantages of the proposed methodology.
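For context on the shrinkage idea mentioned above, here is a generic empirical Bayes shrinkage sketch under a normal-normal model with the prior variance estimated by the method of moments; this is an illustration of the general technique, not necessarily the article's exact estimator:

```python
import numpy as np

# Generic empirical Bayes shrinkage of effect estimates (sketch only).
# Model assumption: beta_hat_i ~ N(beta_i, s2) and beta_i ~ N(0, tau2).
# The posterior mean shrinks each estimate toward zero by the factor
# tau2 / (tau2 + s2), with tau2 estimated from the data.

def eb_shrink(beta_hat, s2):
    beta_hat = np.asarray(beta_hat, dtype=float)
    # Method of moments: E[beta_hat_i^2] = tau2 + s2,
    # so tau2_hat = mean(beta_hat^2) - s2, truncated at zero.
    tau2 = max(np.mean(beta_hat**2) - s2, 0.0)
    return tau2 / (tau2 + s2) * beta_hat

# Large estimated effects are shrunk only mildly; small ones strongly,
# but (unlike hard significance testing) none are set exactly to zero.
effects = [4.0, -0.5, 0.3, 2.0, -0.2, 0.1]
print(eb_shrink(effects, s2=1.0))
```

Hard thresholding at a significance cutoff keeps or kills each effect; the shrinkage estimator above smooths that all-or-nothing decision, which is the sense in which testing approximates empirical Bayes estimation.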
Bayesian Optimal Single Arrays for Robust Parameter Design
, 2008
Abstract
It is critical to estimate control-by-noise interactions in robust parameter design. This can be achieved by using a cross array, which is a cross product of a design for control factors and another design for noise factors. However, the total run size of such arrays can be prohibitively large. To reduce the run size, single arrays are proposed in the literature, where a modified effect hierarchy principle is used for the optimal selection of the arrays. In this article, we argue that effect hierarchy is a property of the system and cannot be altered depending on the objective of an experiment. We propose a Bayesian approach to develop single arrays which incorporate the importance of control-by-noise interactions without altering the effect hierarchy. The approach is very general and places no restrictions on the number of runs, the number of levels, the type of factors, or the type of design. A modified exchange algorithm is proposed for finding the optimal single arrays. We also explain how to design experiments with internal noise factors. The advantages of the proposed approach are illustrated using several examples.
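The cross array described above is simply the cross product of the two component designs, which is why its run size multiplies; a toy sketch with illustrative designs:

```python
from itertools import product

# Toy sketch of a cross array: every control-factor run is crossed with
# every noise-factor run, so the run size multiplies (here 4 x 2 = 8).
# Levels are coded -1/+1; these small designs are illustrative only.

control_design = [(-1, -1), (-1, 1), (1, -1), (1, 1)]  # 2^2 full factorial
noise_design = [(-1,), (1,)]                           # one noise factor

cross_array = [c + n for c, n in product(control_design, noise_design)]
print(len(cross_array))  # -> 8 runs
```

Because every control run is paired with every noise run, all control-by-noise interactions are estimable, but with, say, 16 control runs and 8 noise runs the cross array would already need 128 runs, which is the run-size blow-up that motivates single arrays.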
Bayesian-inspired mixed two- and four-level designs
Abstract
Motivated by a Bayesian framework, we propose a new minimum aberration type criterion for designing experiments with two- and four-level factors. The Bayesian approach helps in overcoming the ad hoc nature of effect ordering in the existing minimum aberration type criteria. Moreover, the approach is also capable of distinguishing between qualitative and quantitative factors. Numerous examples are given to demonstrate the advantages of the proposed approach.
Blind Kriging: A New Method for Developing
Abstract
Kriging is a useful method for developing metamodels for product design optimization. The most popular kriging method, known as ordinary kriging, uses a constant mean in the model. In this article, a modified kriging method is proposed, which has an unknown mean model; therefore it is called blind kriging. The unknown mean model is identified from experimental data using a Bayesian variable selection technique. Many examples are presented that show remarkable improvement in prediction using blind kriging over ordinary kriging. Moreover, the blind kriging predictor is easier to interpret and seems to be more robust to misspecification of the correlation parameters.
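For context on the baseline being improved, a minimal ordinary kriging predictor with a constant mean and a Gaussian correlation function can be sketched as follows (illustration only; blind kriging would replace the constant mean with a mean model selected from the data):

```python
import numpy as np

# Minimal ordinary kriging sketch (1-D): constant unknown mean mu and
# Gaussian correlation R_ij = exp(-theta * (x_i - x_j)^2). The theta
# value here is a fixed illustrative choice, not an estimated parameter.

def ordinary_kriging(x, y, x_new, theta=10.0):
    x, y = np.asarray(x, float), np.asarray(y, float)
    R = np.exp(-theta * (x[:, None] - x[None, :])**2)
    Rinv = np.linalg.inv(R + 1e-10 * np.eye(len(x)))  # tiny nugget
    one = np.ones(len(x))
    mu = (one @ Rinv @ y) / (one @ Rinv @ one)  # GLS estimate of the mean
    r = np.exp(-theta * (x_new - x)**2)         # correlations to new point
    return mu + r @ Rinv @ (y - mu * one)

x = np.array([0.0, 0.25, 0.5, 0.75, 1.0])
y = np.sin(2 * np.pi * x)
print(ordinary_kriging(x, y, 0.25))  # interpolates the training point
```

Note that the predictor interpolates the training data exactly (up to the nugget); ordinary and blind kriging differ in the mean term, not in this interpolation property.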
Contents
Abstract
The evaluation of aerospace designs is synonymous with the use of long-running and computationally intensive simulations. This fuels the desire to harness the efficiency of surrogate-based methods in aerospace design optimization. Recent advances in surrogate-based design methodology bring the promise of efficient global optimization closer to reality. We review the present state of the art of constructing surrogate models and their use in optimization strategies. We make extensive use of pictorial examples and, since no method is truly universal,
Convex Modeling of Interactions with Strong Heredity
, 2014
Abstract
We consider the task of fitting a regression model involving interactions among a potentially large set of covariates, in which we wish to enforce strong heredity. We propose FAMILY, a very general framework for this task. Our proposal is a generalization of several existing methods, such as VANISH [Radchenko and James, 2010], hierNet [Bien et al., 2013], the all-pairs lasso, and the lasso using only main effects. It can be formulated as the solution to a convex optimization problem, which we solve using an efficient alternating direction method of multipliers (ADMM) algorithm. This algorithm has guaranteed convergence to the global optimum, can be easily specialized to any convex penalty function of interest, and allows for a straightforward extension to the setting of generalized linear models. We derive an unbiased estimator of the degrees of freedom of FAMILY, and explore its performance in a simulation study and on an HIV sequence data set.