A Bayesian Framework for the Analysis of Microarray Expression Data: Regularized t-Test and Statistical Inferences of Gene Changes
 Bioinformatics
, 2001
"... Motivation: DNA microarrays are now capable of providing genomewide patterns of gene expression across many different conditions. The first level of analysis of these patterns requires determining whether observed differences in expression are significant or not. Current methods are unsatisfactory ..."
Abstract

Cited by 491 (6 self)
 Add to MetaCart
Motivation: DNA microarrays are now capable of providing genome-wide patterns of gene expression across many different conditions. The first level of analysis of these patterns requires determining whether observed differences in expression are significant or not. Current methods are unsatisfactory due to the lack of a systematic framework that can accommodate noise, variability, and low replication often typical of microarray data. Results: We develop a Bayesian probabilistic framework for microarray data analysis. At the simplest level, we model log-expression values by independent normal distributions, parameterized by corresponding means and variances with hierarchical prior distributions. We derive point estimates for both parameters and hyperparameters, and regularized expressions for the variance of each gene by combining the empirical variance with a local background variance associated with neighboring genes. An additional hyperparameter, inversely related to the number of empirical observations, determines the strength of the background variance. Simulations show that these point estimates, combined with a t-test, provide a systematic inference approach that compares favorably with simple t-test or fold methods, and partly compensate for the lack of replication. Availability: The approach is implemented in software called Cyber-T, accessible through a Web interface at www.genomics.uci.edu/software.html. The code is available as Open Source and is written in the freely available statistical language R. Contact: pfbaldi@ics.uci.edu, tdlong@uci.edu.
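A minimal Python sketch of the variance regularization described above, assuming the commonly quoted Cyber-T form in which the empirical variance from n replicates is shrunk toward a local background variance with pseudo-count weight nu0. The function names, the default nu0, and the caller-supplied background variances are illustrative assumptions, not the package's actual interface.

import numpy as np

def regularized_variance(s2, s2_background, n, nu0):
    """Shrink the empirical variance s2 (from n replicates) toward a
    local background variance, with pseudo-count weight nu0."""
    return (nu0 * s2_background + (n - 1) * s2) / (nu0 + n - 2)

def regularized_t(x_control, x_treat, bg_var_c, bg_var_t, nu0=10):
    """Two-sample t-statistic built from regularized variances.
    x_control, x_treat: log-expression replicates for one gene;
    bg_var_*: background variances from genes of similar expression."""
    nc, nt = len(x_control), len(x_treat)
    vc = regularized_variance(np.var(x_control, ddof=1), bg_var_c, nc, nu0)
    vt = regularized_variance(np.var(x_treat, ddof=1), bg_var_t, nt, nu0)
    return (np.mean(x_treat) - np.mean(x_control)) / np.sqrt(vc / nc + vt / nt)

With few replicates the nu0-weighted background term dominates, which is how the shrinkage partly compensates for low replication.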
Using simulation methods for Bayesian econometric models: Inference, development and communication
 Econometric Reviews
, 1999
"... This paper surveys the fundamental principles of subjective Bayesian inference in econometrics and the implementation of those principles using posterior simulation methods. The emphasis is on the combination of models and the development of predictive distributions. Moving beyond conditioning on a ..."
Abstract

Cited by 355 (16 self)
 Add to MetaCart
This paper surveys the fundamental principles of subjective Bayesian inference in econometrics and the implementation of those principles using posterior simulation methods. The emphasis is on the combination of models and the development of predictive distributions. Moving beyond conditioning on a fixed number of completely specified models, the paper introduces subjective Bayesian tools for formal comparison of these models with as yet incompletely specified models. The paper then shows how posterior simulators can facilitate communication between investigators (for example, econometricians) on the one hand and remote clients (for example, decision makers) on the other, enabling clients to vary the prior distributions and functions of interest employed by investigators. A theme of the paper is the practicality of subjective Bayesian methods. To this end, the paper describes publicly available software for Bayesian inference, model development, and communication and provides illustrations using two simple econometric models.
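One concrete mechanism for letting clients vary the prior, as described above, is to importance-reweight the investigator's posterior draws by the ratio of the client's prior to the original one. A minimal sketch under that assumption (the helper name and the toy normal example are hypothetical, and weight-degeneracy checks are omitted):

import numpy as np

def reweight_posterior(draws, log_prior_old, log_prior_new):
    """Return normalized importance weights that convert draws made
    under log_prior_old into an approximation of the posterior under
    log_prior_new; normalizing constants cancel in the ratio."""
    logw = np.array([log_prior_new(t) - log_prior_old(t) for t in draws])
    w = np.exp(logw - logw.max())   # subtract max for numerical stability
    return w / w.sum()

# Toy example: draws produced under a diffuse N(0, 10^2) prior,
# re-used by a client whose prior is N(0, 1).
rng = np.random.default_rng(0)
draws = rng.normal(1.0, 0.5, size=5000)        # stand-in simulator output
w = reweight_posterior(draws,
                       lambda t: -0.5 * (t / 10.0) ** 2,
                       lambda t: -0.5 * t ** 2)
print("client's posterior mean:", np.sum(w * draws))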
Bayesian Model Averaging for Linear Regression Models
 Journal of the American Statistical Association
, 1997
"... We consider the problem of accounting for model uncertainty in linear regression models. Conditioning on a single selected model ignores model uncertainty, and thus leads to the underestimation of uncertainty when making inferences about quantities of interest. A Bayesian solution to this problem in ..."
Abstract

Cited by 326 (17 self)
 Add to MetaCart
(Show Context)
We consider the problem of accounting for model uncertainty in linear regression models. Conditioning on a single selected model ignores model uncertainty, and thus leads to the underestimation of uncertainty when making inferences about quantities of interest. A Bayesian solution to this problem involves averaging over all possible models (i.e., combinations of predictors) when making inferences about quantities of interest ...
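A minimal sketch of the averaging step: enumerate all subsets of predictors, weight each fitted model, and report posterior inclusion probabilities. BIC-based weights are used here as a rough stand-in for posterior model probabilities (the paper's own priors and model-search strategy differ):

import itertools
import numpy as np

def bma_linear(X, y):
    """Average over all 2^p subsets of the p columns of X (plus an
    intercept), weighting each model by exp(-BIC/2), normalized."""
    n, p = X.shape
    subsets, bics = [], []
    for k in range(p + 1):
        for idx in itertools.combinations(range(p), k):
            Z = np.column_stack([np.ones(n)] + [X[:, j] for j in idx])
            beta, *_ = np.linalg.lstsq(Z, y, rcond=None)
            rss = np.sum((y - Z @ beta) ** 2)      # assumes rss > 0
            subsets.append(idx)
            bics.append(n * np.log(rss / n) + Z.shape[1] * np.log(n))
    w = np.exp(-0.5 * (np.array(bics) - min(bics)))
    w /= w.sum()
    inclusion = [sum(w[m] for m, idx in enumerate(subsets) if j in idx)
                 for j in range(p)]
    return w, subsets, inclusion

Inferences about a quantity of interest are then weighted averages over the models rather than conditional on a single selected one.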
Bayesian Experimental Design: A Review
 Statistical Science
, 1995
"... This paper reviews the literature on Bayesian experimental design, both for linear and nonlinear models. A unified view of the topic is presented by putting experimental design in a decision theoretic framework. This framework justifies many optimality criteria, and opens new possibilities. Various ..."
Abstract

Cited by 320 (1 self)
 Add to MetaCart
This paper reviews the literature on Bayesian experimental design, both for linear and nonlinear models. A unified view of the topic is presented by putting experimental design in a decision-theoretic framework. This framework justifies many optimality criteria and opens new possibilities. Various design criteria become part of a single, coherent approach.
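A toy illustration of the decision-theoretic framing: score each candidate design by an expected loss and select the minimizer. The model (a one-parameter conjugate normal linear model, whose posterior variance does not depend on the data) and the candidate designs are assumptions chosen so the criterion is closed-form:

import numpy as np

def expected_posterior_var(design, tau2=1.0, sigma2=1.0):
    """Preposterior variance of theta in y_i = theta * x_i + noise,
    with prior theta ~ N(0, tau2) and noise variance sigma2."""
    x = np.asarray(design, dtype=float)
    return 1.0 / (1.0 / tau2 + np.sum(x ** 2) / sigma2)

designs = {"spread": [-1.0, -0.5, 0.5, 1.0],
           "extremes": [-1.0, -1.0, 1.0, 1.0]}
best = min(designs, key=lambda name: expected_posterior_var(designs[name]))
print(best)   # 'extremes': mass at the endpoints minimizes posterior variance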
Model Choice: A Minimum Posterior Predictive Loss Approach
, 1998
"... Model choice is a fundamental and much discussed activity in the analysis of data sets. Hierarchical models introducing random effects can not be handled by classical methods. Bayesian approaches using predictive distributions can, though the formal solution, which includes Bayes factors as a specia ..."
Abstract

Cited by 132 (13 self)
 Add to MetaCart
Model choice is a fundamental and much discussed activity in the analysis of data sets. Hierarchical models introducing random effects cannot be handled by classical methods. Bayesian approaches using predictive distributions can, though the formal solution, which includes Bayes factors as a special case, can be criticized. We propose a predictive criterion where the goal is good prediction of a replicate of the observed data but tempered by fidelity to the observed values. We obtain this criterion by minimizing posterior loss for a given model and then, for models under consideration, select the one which minimizes this criterion. For a broad range of losses, the criterion emerges approximately as a form partitioned into a goodness-of-fit term and a penalty term. In the context of generalized linear mixed effects models we obtain a penalized deviance criterion comprising a piece which is a Bayesian deviance measure and a piece which is a penalty for model complexity. We illustrate ...
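A minimal sketch of the criterion under squared-error loss, assuming the commonly cited partitioned form D_k = P + (k/(k+1)) G, where G measures fidelity to the observed data and P penalizes predictive variance; the (n_draws, n_obs) layout of the replicates is an assumption:

import numpy as np

def posterior_predictive_loss(y_obs, y_rep, k=np.inf):
    """Posterior predictive loss from replicate draws y_rep, shaped
    (n_draws, n_obs). Returns the goodness-of-fit term G, the
    complexity penalty P, and the criterion D_k."""
    mu = y_rep.mean(axis=0)                       # predictive means
    G = np.sum((np.asarray(y_obs) - mu) ** 2)     # fidelity to data
    P = np.sum(y_rep.var(axis=0))                 # predictive variances
    weight = 1.0 if np.isinf(k) else k / (k + 1.0)
    return G, P, P + weight * G

Among candidate models, the one with the smallest D_k is selected.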
The Contributions of the Economics of Information to Twentieth Century Economics
 Quarterly Journal of Economics
, 2000
"... In the field of economics, perhaps the most important break with the past—one that leaves open huge areas for future work—lies in the economics of information. It is now recognized that information is imperfect, obtaining information can be costly, there are important asymmetries of information, an ..."
Abstract

Cited by 120 (0 self)
 Add to MetaCart
In the field of economics, perhaps the most important break with the past—one that leaves open huge areas for future work—lies in the economics of information. It is now recognized that information is imperfect, obtaining information can be costly, there are important asymmetries of information, and the extent of information asymmetries is affected by actions of firms and individuals. This recognition deeply affects the understanding of wisdom inherited from the past, such as the fundamental welfare theorem and some of the basic characterization of a market economy, and provides explanations of economic and social phenomena that otherwise would be hard to understand.
From Stochastic Dominance to Mean-Risk Models: Semideviations as Risk Measures
, 1997
"... Two methods are frequently used for modeling the choice among uncertain outcomes: stochastic dominance and mean–risk approaches. The former is based on an axiomatic model of riskaverse preferences but does not provide a convenient computational recipe. The latter quantifies the problem in a lucid f ..."
Abstract

Cited by 109 (21 self)
 Add to MetaCart
Two methods are frequently used for modeling the choice among uncertain outcomes: stochastic dominance and mean–risk approaches. The former is based on an axiomatic model of risk-averse preferences but does not provide a convenient computational recipe. The latter quantifies the problem in a lucid form of two criteria with possible trade-off analysis, but cannot model all risk-averse preferences. In particular, if variance is used as a measure of risk, the resulting mean–variance (Markowitz) model is, in general, not consistent with stochastic dominance rules. This paper shows that using the standard semideviation (square root of the semivariance) as the risk measure makes the mean–risk model consistent with second-degree stochastic dominance, provided that the trade-off coefficient is bounded by a certain constant. Similar results are obtained for the absolute semideviation, and for the absolute and standard deviations in the case of symmetric or bounded distributions. In the analysis we use a new tool, the Outcome–Risk diagram ...
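A minimal sketch of the resulting mean-risk score, with risk taken as the standard or absolute semideviation computed from a sample; following the bound mentioned above, the trade-off coefficient is assumed to satisfy 0 <= lam <= 1:

import numpy as np

def mean_risk_score(x, lam=0.5, standard=True):
    """Return mu - lam * risk for a sample of outcomes x, where risk
    is the standard semideviation (sqrt of the lower semivariance)
    or, if standard=False, the absolute semideviation."""
    x = np.asarray(x, dtype=float)
    mu = x.mean()
    shortfall = np.maximum(mu - x, 0.0)   # only outcomes below the mean
    risk = np.sqrt(np.mean(shortfall ** 2)) if standard else shortfall.mean()
    return mu - lam * risk

Ranking prospects by this score never contradicts second-degree stochastic dominance as long as lam stays within the bound.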
Eliciting Informative Feedback: The Peer-Prediction Method
 Management Science
, 2005
"... informs ® doi 10.1287/mnsc.1050.0379 ..."
Decision Theory in Expert Systems and Artificial Intelligence
 International Journal of Approximate Reasoning
, 1988
"... Despite their different perspectives, artificial intelligence (AI) and the disciplines of decision science have common roots and strive for similar goals. This paper surveys the potential for addressing problems in representation, inference, knowledge engineering, and explanation within the decision ..."
Abstract

Cited by 105 (19 self)
 Add to MetaCart
(Show Context)
Despite their different perspectives, artificial intelligence (AI) and the disciplines of decision science have common roots and strive for similar goals. This paper surveys the potential for addressing problems in representation, inference, knowledge engineering, and explanation within the decision-theoretic framework. Recent analyses of the restrictions of several traditional AI reasoning techniques, coupled with the development of more tractable and expressive decision-theoretic representation and inference strategies, have stimulated renewed interest in decision theory and decision analysis. We describe early experience with simple probabilistic schemes for automated reasoning, review the dominant expert-system paradigm, and survey some recent research at the crossroads of AI and decision science. In particular, we present the belief network and influence diagram representations. Finally, we discuss issues that have not been studied in detail within the expert-systems setting ...
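As a reminder of what the "simple probabilistic schemes" look like, here is the smallest possible belief-network fragment, a single cause-to-observation link queried with Bayes' rule; all numbers are purely illustrative:

# Disease -> Test, a two-node belief network, queried by Bayes' rule.
p_disease = 0.01                 # prior probability of the condition
p_pos_given_disease = 0.95       # sensitivity
p_pos_given_healthy = 0.05       # false-positive rate

p_pos = (p_pos_given_disease * p_disease
         + p_pos_given_healthy * (1 - p_disease))
p_disease_given_pos = p_pos_given_disease * p_disease / p_pos
print(f"P(disease | positive test) = {p_disease_given_pos:.3f}")  # ~0.161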
Optimization of Convex Risk Functions
, 2004
"... We consider optimization problems involving convex risk functions. By employing techniques of convex analysis and optimization theory in vector spaces of measurable functions we develop new representation theorems for risk models, and optimality and duality theory for problems involving risk functio ..."
Abstract

Cited by 103 (14 self)
 Add to MetaCart
(Show Context)
We consider optimization problems involving convex risk functions. By employing techniques of convex analysis and optimization theory in vector spaces of measurable functions we develop new representation theorems for risk models, and optimality and duality theory for problems involving risk functions.
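As one worked example of a convex risk function in this family, here is sample-based Conditional Value-at-Risk via the Rockafellar-Uryasev variational formula CVaR_a(L) = min_t { t + E[(L - t)_+] / (1 - a) }. CVaR is a standard convex risk function but is only an assumed example here, and the quantile shortcut for the minimizing t is a finite-sample approximation:

import numpy as np

def cvar(losses, alpha=0.95):
    """Sample CVaR: plug the empirical alpha-quantile (a minimizer of
    the variational formula) into t + mean((L - t)_+) / (1 - alpha)."""
    losses = np.asarray(losses, dtype=float)
    t = np.quantile(losses, alpha)
    return t + np.mean(np.maximum(losses - t, 0.0)) / (1.0 - alpha)

# Heavier-tailed losses receive a larger risk value.
rng = np.random.default_rng(1)
print(cvar(rng.normal(0.0, 1.0, 100_000)),
      cvar(rng.standard_t(3, 100_000)))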