Results 1 - 10 of 48,177
Quantal Response Equilibria For Normal Form Games
Games and Economic Behavior, 1995
"... We investigate the use of standard statistical models for quantal choice in a game theoretic setting. Players choose strategies based on relative expected utility, and assume other players do so as well. We define a Quantal Response Equilibrium (QRE) as a fixed point of this process, and establish e ..."
Cited by 647 (28 self)
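The QRE in this abstract is best known in its logit form. A minimal sketch of that fixed point, using assumed payoff matrices and precision parameter lam, and naive fixed-point iteration (which may need damping to converge in general games):

```python
import numpy as np

def logit_qre(A, B, lam=2.0, iters=10000, tol=1e-12):
    # Logit quantal response equilibrium of a two-player normal form game:
    # each player mixes over pure strategies in proportion to
    # exp(lam * expected utility) given the other's mixture; a QRE is a
    # fixed point of this map. lam = 0 gives uniform play; lam -> inf
    # approaches best response (Nash).
    m, n = A.shape
    p, q = np.full(m, 1 / m), np.full(n, 1 / n)
    for _ in range(iters):
        p_new = np.exp(lam * (A @ q)); p_new /= p_new.sum()
        q_new = np.exp(lam * (B.T @ p)); q_new /= q_new.sum()
        if max(abs(p_new - p).max(), abs(q_new - q).max()) < tol:
            break
        p, q = p_new, q_new
    return p, q

# Matching pennies: uniform play is the QRE here for every lam.
A = np.array([[1.0, -1.0], [-1.0, 1.0]])
print(logit_qre(A, -A))
```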
Exploration, normalization, and summaries of high density oligonucleotide array probe level data.
Biostatistics, 2003
"... SUMMARY In this paper we report exploratory analyses of highdensity oligonucleotide array data from the Affymetrix GeneChip R system with the objective of improving upon currently used measures of gene expression. Our analyses make use of three data sets: a small experimental study consisting of f ..."
Cited by 854 (33 self)
familiar features of the perfect match and mismatch probe (PM and MM) values of these data, and examine the variance-mean relationship with probe-level data from probes believed to be defective, and so delivering noise only. We explain why we need to normalize the arrays to one another using probe-level
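The normalization this snippet alludes to is, in the RMA pipeline that grew out of this paper, quantile normalization across arrays. A minimal sketch (names and toy data are illustrative; ties are handled naively, unlike production implementations):

```python
import numpy as np

def quantile_normalize(X):
    # X is a (probes x arrays) matrix. Force every column (array) to share
    # the same empirical distribution: the mean of the sorted columns.
    order = np.argsort(X, axis=0)                      # per-array probe order
    ranks = np.argsort(order, axis=0)                  # rank of each value
    mean_quantiles = np.sort(X, axis=0).mean(axis=1)   # reference distribution
    return mean_quantiles[ranks]

rng = np.random.default_rng(0)
X = rng.lognormal(mean=rng.normal(size=4), size=(1000, 4))  # 4 toy arrays
Xn = quantile_normalize(X)
print(Xn.mean(axis=0))   # columns now have identical distributions
```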
Modeling and Forecasting Realized Volatility
2002
"... this paper is built. First, although raw returns are clearly leptokurtic, returns standardized by realized volatilities are approximately Gaussian. Second, although the distributions of realized volatilities are clearly rightskewed, the distributions of the logarithms of realized volatilities are a ..."
Cited by 549 (50 self)
frequency models, we find that our simple Gaussian VAR forecasts generally produce superior forecasts. Furthermore, we show that, given the theoretically motivated and empirically plausible assumption of normally distributed returns conditional on the realized volatilities, the resulting lognormal-normal mixture
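A small simulation can illustrate the two distributional facts quoted above. The toy stochastic-volatility model and all parameter values below are assumptions for illustration, not the paper's FX data:

```python
import numpy as np
from scipy.stats import kurtosis, skew

rng = np.random.default_rng(1)
days, intraday = 2000, 48                  # e.g. 48 half-hour returns per day
sigma = np.exp(0.25 * rng.standard_normal(days) - 1.0)   # daily volatility
r = sigma[:, None] * rng.standard_normal((days, intraday)) / np.sqrt(intraday)

daily_ret = r.sum(axis=1)                  # daily return
rv = np.sqrt((r ** 2).sum(axis=1))         # realized volatility from intraday returns

print("excess kurtosis, raw returns:  ", kurtosis(daily_ret))       # leptokurtic
print("excess kurtosis, standardized: ", kurtosis(daily_ret / rv))  # near 0
print("skewness of rv vs log(rv):     ", skew(rv), skew(np.log(rv)))
```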
Speaker verification using Adapted Gaussian mixture models
Digital Signal Processing, 2000
"... In this paper we describe the major elements of MIT Lincoln Laboratoryâ€™s Gaussian mixture model (GMM)based speaker verification system used successfully in several NIST Speaker Recognition Evaluations (SREs). The system is built around the likelihood ratio test for verification, using simple but ef ..."
Cited by 1010 (42 self)
but effective GMMs for likelihood functions, a universal background model (UBM) for alternative speaker representation, and a form of Bayesian adaptation to derive speaker models from the UBM. The development and use of a handset detector and score normalization to greatly improve verification performance
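The GMM-UBM likelihood-ratio scoring outlined in this snippet can be sketched with scikit-learn standing in for the paper's implementation. The relevance factor, feature dimension, and synthetic data below are assumptions, and only the component means are adapted (the paper's Bayesian adaptation is richer):

```python
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(2)
background = rng.standard_normal((5000, 10))    # pooled impostor features
enroll = rng.standard_normal((300, 10)) + 0.5   # target speaker features
test = rng.standard_normal((200, 10)) + 0.5     # claimed-identity trial

ubm = GaussianMixture(n_components=8, covariance_type="diag",
                      random_state=0).fit(background)

# MAP-style mean adaptation: shift each UBM mean toward the enrollment
# data in proportion to how much data the component "owns".
resp = ubm.predict_proba(enroll)                # (frames x components)
n_k = resp.sum(axis=0)
ex_k = resp.T @ enroll / np.maximum(n_k[:, None], 1e-8)
r = 16.0                                        # relevance factor (assumed)
alpha = (n_k / (n_k + r))[:, None]

speaker = GaussianMixture(n_components=8, covariance_type="diag")
speaker.weights_, speaker.covariances_ = ubm.weights_, ubm.covariances_
speaker.precisions_cholesky_ = ubm.precisions_cholesky_
speaker.means_ = alpha * ex_k + (1 - alpha) * ubm.means_

# Verification score: average per-frame log-likelihood ratio.
print(speaker.score(test) - ubm.score(test))    # > 0 favors the target
```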
Improved methods for building protein models in electron density maps and the location of errors in these models
Acta Crystallogr. Sect. A, 1991
"... Map interpretation remains a critical step in solving the structure of a macromolecule. Errors introduced at this early stage may persist throughout crystallographic refinement and result in an incorrect structure. The normally quoted crystallographic residual is often a poor description for the q ..."
Cited by 1051 (9 self)
National debt in a neoclassical growth model.
American Economic Review
"... This paper contains a model designed to serve two purposes, to examine longrun competitive equilibrium in a growth model and then to explore the effects on this equilibrium of government debt. Samuelson [8] has examined the determination of interest rates in a singlecommodity world without durable ..."
Cited by 698 (0 self)
A model for technical inefficiency effects in a stochastic frontier production function for panel data
Empirical Economics, 1995
"... Abstract: A stochastic frontier production function is defined for panel data on firms, in which the nonnegative technical inetGciency effects are assumed to be a function of firmspecific variables and time. The inefficiency effects are assumed to be independently distributed as truncations of nor ..."
Cited by 555 (4 self)
of normal distributions with constant variance, but with means which are a linear function of observable variables. This panel data model is an extension of recently proposed models for inefficiency effects in stochastic frontiers for cross-sectional data. An empirical application of the model is obtained
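The model in this snippet (output = frontier + symmetric noise - inefficiency, where the inefficiency term is a zero-truncated normal whose pre-truncation mean is linear in firm-specific variables) can be simulated directly. All parameter values below are illustrative:

```python
import numpy as np
from scipy.stats import truncnorm

rng = np.random.default_rng(3)
n = 500
x = rng.standard_normal((n, 2))        # production inputs (log scale)
z = rng.uniform(size=n)                # firm-specific covariate
beta = np.array([0.6, 0.3])
sigma_v, sigma_u = 0.2, 0.4

mu = 0.2 + 0.8 * z                     # inefficiency mean, linear in z
u = truncnorm.rvs((0 - mu) / sigma_u, np.inf, loc=mu, scale=sigma_u,
                  random_state=rng)    # truncated-normal inefficiency
v = sigma_v * rng.standard_normal(n)   # symmetric noise
y = x @ beta + v - u                   # frontier model for log output
print("mean technical inefficiency:", u.mean())
```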
An analysis of transformations
Journal of the Royal Statistical Society, Series B (Methodological), 1964
"... In the analysis of data it is often assumed that observations y,, y,,...,y, are independently normally distributed with constant variance and with expectations specified by a model linear in a set of parameters 0. In this paper we make the less restrictive assumption that such a normal, homoscedasti ..."
Cited by 1067 (3 self)
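The transformation family this abstract introduces is the Box-Cox power transform, y(lambda) = (y^lambda - 1)/lambda (log y at lambda = 0), with lambda chosen by maximum likelihood so the transformed data are approximately normal with constant variance. scipy's implementation performs the profile-likelihood step; the data below are synthetic:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(4)
y = rng.lognormal(mean=0.0, sigma=0.7, size=500)   # skewed positive data

y_t, lam = stats.boxcox(y)                         # MLE of lambda
print("estimated lambda:", lam)                    # near 0: log transform
print("skewness before/after:", stats.skew(y), stats.skew(y_t))
```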
Evaluating the Accuracy of Sampling-Based Approaches to the Calculation of Posterior Moments
In Bayesian Statistics, 1992
"... Data augmentation and Gibbs sampling are two closely related, samplingbased approaches to the calculation of posterior moments. The fact that each produces a sample whose constituents are neither independent nor identically distributed complicates the assessment of convergence and numerical accurac ..."
Cited by 604 (12 self)
accuracy of the approximations to the expected value of functions of interest under the posterior. In this paper methods from spectral analysis are used to evaluate numerical accuracy formally and construct diagnostics for convergence. These methods are illustrated in the normal linear model
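This snippet describes what became the Geweke diagnostic: estimate the numerical standard error of an MCMC mean via the spectral density at frequency zero, then compare early and late segments of the chain. A minimal sketch, with a simple tapered-autocovariance estimator standing in for the paper's periodogram-based one (the window length and segment fractions are conventional choices, not the paper's prescriptions):

```python
import numpy as np

def spectral_nse(x, max_lag=None):
    # Numerical standard error of the mean of correlated draws, from a
    # Bartlett-tapered estimate of the spectral density at frequency zero.
    x = np.asarray(x, dtype=float)
    n = len(x)
    if max_lag is None:
        max_lag = int(n ** 0.5)
    xc = x - x.mean()
    s = xc @ xc / n                                  # lag-0 autocovariance
    for k in range(1, max_lag):
        s += 2 * (1 - k / max_lag) * (xc[:-k] @ xc[k:]) / n
    return np.sqrt(s / n)

def geweke_z(x, first=0.1, last=0.5):
    # z-score comparing the mean of the early part of the chain with the
    # mean of the late part, using spectral NSEs for the denominator.
    a, b = x[: int(first * len(x))], x[int((1 - last) * len(x)):]
    return (a.mean() - b.mean()) / np.hypot(spectral_nse(a), spectral_nse(b))

# AR(1) chain as a stand-in for MCMC output.
rng = np.random.default_rng(5)
x = np.zeros(20000)
for t in range(1, len(x)):
    x[t] = 0.9 * x[t - 1] + rng.standard_normal()
print("z =", geweke_z(x))   # |z| well above 2 suggests non-convergence
```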
Bayesian density estimation and inference using mixtures.
J. Amer. Statist. Assoc., 1995
"... JSTOR is a notforprofit service that helps scholars, researchers, and students discover, use, and build upon a wide range of content in a trusted digital archive. We use information technology and tools to increase productivity and facilitate new forms of scholarship. For more information about J ..."
Cited by 653 (18 self)
assessment of modality, and inference on the number of components. Also, convergence results are established for a general class of normal mixture models.
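Density estimation with a mixture of normals in the spirit of this paper can be sketched with scikit-learn's truncated Dirichlet-process variational approximation standing in for the paper's Gibbs sampler; the data and truncation level below are illustrative:

```python
import numpy as np
from sklearn.mixture import BayesianGaussianMixture

rng = np.random.default_rng(6)
y = np.concatenate([rng.normal(-2, 0.5, 300), rng.normal(1, 1.0, 700)])

dpm = BayesianGaussianMixture(
    n_components=20,                       # truncation level
    weight_concentration_prior_type="dirichlet_process",
    random_state=0,
).fit(y.reshape(-1, 1))

grid = np.linspace(-5, 5, 7)[:, None]
print(np.exp(dpm.score_samples(grid)))     # approximate predictive density
print("components actually used:", (dpm.weights_ > 0.01).sum())
```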