Results 11 – 20 of 3,342,991
Quantal Response Equilibria for Normal Form Games
 Games and Economic Behavior
, 1995
"... We investigate the use of standard statistical models for quantal choice in a game theoretic setting. Players choose strategies based on relative expected utility, and assume other players do so as well. We define a Quantal Response Equilibrium (QRE) as a fixed point of this process, and establish e ..."
Abstract

Cited by 632 (28 self)
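The fixed point described in this abstract can be sketched with a logit quantal response: each player's choice probabilities are proportional to exponentiated expected utilities, and the equilibrium is a fixed point of the resulting map. A minimal sketch, assuming a hypothetical 2×2 game — the payoff matrices and the rationality parameter `lam` below are illustrative, not taken from the paper:

```python
import math

# Hypothetical 2x2 game (payoffs chosen for illustration, not from the paper):
# row player's payoffs A[i][j], column player's payoffs B[i][j].
A = [[3.0, 0.0], [1.0, 2.0]]
B = [[2.0, 1.0], [0.0, 3.0]]

def logit(u0, u1, lam):
    """Logit quantal response: choice probabilities proportional to exp(lam * EU)."""
    m = max(lam * u0, lam * u1)            # stabilize the exponentials
    e0 = math.exp(lam * u0 - m)
    e1 = math.exp(lam * u1 - m)
    return e0 / (e0 + e1)

def logit_qre(lam, iters=2000):
    """Fixed-point iteration: each player quantal-responds to the other's mix."""
    p = q = 0.5                            # P(row plays 0), P(col plays 0)
    for _ in range(iters):
        u_row0 = q * A[0][0] + (1 - q) * A[0][1]
        u_row1 = q * A[1][0] + (1 - q) * A[1][1]
        u_col0 = p * B[0][0] + (1 - p) * B[1][0]
        u_col1 = p * B[0][1] + (1 - p) * B[1][1]
        p, q = logit(u_row0, u_row1, lam), logit(u_col0, u_col1, lam)
    return p, q

# lam = 0 gives uniform randomization; large lam approaches a Nash equilibrium.
print(logit_qre(0.0))   # (0.5, 0.5)
print(logit_qre(10.0))
```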
COMPARISON OF STATISTICAL MODELS FOR CPUE STANDARDIZATION BY INFORMATION CRITERIA: POISSON MODEL VS. LOGNORMAL MODEL
"... The analysis by generalized linear model (GLM) has been used for the standardization of CPUE, recently. Such calculation has usually been performed through GLM and/or GENMOD procedure of SAS/STAT package assuming that CPUE model with lognormal distribution and/or catch one with Poisson distribution ..."
Abstract
Normalization for cDNA microarray data: a robust composite method addressing single and multiple slide systematic variation
, 2002
"... There are many sources of systematic variation in cDNA microarray experiments which affect the measured gene expression levels (e.g. differences in labeling efficiency between the two fluorescent dyes). The term normalization refers to the process of removing such variation. A constant adjustment is ..."
Abstract

Cited by 699 (9 self)
is often used to force the distribution of the intensity log ratios to have a median of zero for each slide. However, such global normalization approaches are not adequate in situations where dye biases can depend on spot overall intensity and/or spatial location within the array. This article proposes
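The global normalization this snippet criticizes can be sketched in a few lines: compute the log ratio M = log2(R/G) per spot and subtract the slide-wide median so the normalized log ratios are centered at zero. The intensities below are made-up toy values; the article's actual contribution is a composite (intensity- and location-dependent) method for exactly the cases where this global adjustment fails:

```python
import math
from statistics import median

# Toy two-channel intensities for one slide (hypothetical numbers).
red   = [120.0, 300.0, 80.0, 950.0, 60.0]
green = [100.0, 240.0, 90.0, 700.0, 75.0]

# Log ratios M = log2(R/G); global normalization subtracts the slide median
# so that the normalized M values have median zero.
M = [math.log2(r / g) for r, g in zip(red, green)]
c = median(M)
M_norm = [m - c for m in M]
print(median(M_norm))  # 0.0
```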
Modeling and Forecasting Realized Volatility
, 2002
"... this paper is built. First, although raw returns are clearly leptokurtic, returns standardized by realized volatilities are approximately Gaussian. Second, although the distributions of realized volatilities are clearly right-skewed, the distributions of the logarithms of realized volatilities are a ..."
Abstract

Cited by 537 (50 self)
frequency models, we find that our simple Gaussian VAR forecasts generally produce superior forecasts. Furthermore, we show that, given the theoretically motivated and empirically plausible assumption of normally distributed returns conditional on the realized volatilities, the resulting lognormal-normal mixture
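The paper's first empirical regularity — returns standardized by realized volatility are approximately Gaussian — rests on computing realized volatility as the square root of summed squared intraday returns. A minimal sketch on simulated data; the 78 five-minute returns per day and the return scale are assumptions for illustration:

```python
import math, random

random.seed(0)

# Simulated intraday returns for one day (hypothetical): 78 five-minute returns.
intraday = [random.gauss(0.0, 0.001) for _ in range(78)]

# Realized variance is the sum of squared intraday returns;
# realized volatility is its square root.
rv = math.sqrt(sum(r * r for r in intraday))

daily_return = sum(intraday)
standardized = daily_return / rv   # approximately N(0, 1) per the paper's finding
print(rv, standardized)
```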
Bayesian Analysis of Stochastic Volatility Models
, 1994
"... this article is to develop new methods for inference and prediction in a simple class of stochastic volatility models in which the logarithm of conditional volatility follows an autoregressive (AR) time series model. Unlike the autoregressive conditional heteroscedasticity (ARCH) and generalized ARCH ..."
Abstract

Cited by 587 (25 self)
ARCH (GARCH) models [see Bollerslev, Chou, and Kroner (1992) for a survey of ARCH modeling], both the mean and log-volatility equations have separate error terms. The ease of evaluating the ARCH likelihood function and the ability of the ARCH specification to accommodate the time-varying volatility
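The distinction drawn here — separate error terms in the mean and log-volatility equations, unlike ARCH — can be made concrete by simulating the model class the abstract describes. A sketch with illustrative parameter values (mu, phi, sigma_v are assumptions, not estimates from the paper):

```python
import math, random

random.seed(1)

# AR(1) stochastic volatility model (illustrative parameters):
#   log h_t = mu + phi * (log h_{t-1} - mu) + sigma_v * v_t,   v_t  ~ N(0, 1)
#   y_t     = sqrt(h_t) * eps_t,                               eps_t ~ N(0, 1)
# Note v_t and eps_t are independent: two error terms, unlike ARCH/GARCH.
mu, phi, sigma_v, T = -1.0, 0.95, 0.2, 1000
log_h = mu
returns = []
for _ in range(T):
    log_h = mu + phi * (log_h - mu) + sigma_v * random.gauss(0, 1)
    returns.append(math.exp(log_h / 2) * random.gauss(0, 1))
print(len(returns))
```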
Speaker verification using Adapted Gaussian mixture models
 Digital Signal Processing
, 2000
"... In this paper we describe the major elements of MIT Lincoln Laboratory’s Gaussian mixture model (GMM)-based speaker verification system used successfully in several NIST Speaker Recognition Evaluations (SREs). The system is built around the likelihood ratio test for verification, using simple but ef ..."
Abstract

Cited by 981 (42 self)
but effective GMMs for likelihood functions, a universal background model (UBM) for alternative speaker representation, and a form of Bayesian adaptation to derive speaker models from the UBM. The development and use of a handset detector and score normalization to greatly improve verification performance
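The likelihood ratio test at the core of this system scores an utterance under a speaker model against a universal background model (UBM) and accepts when the difference in log-likelihoods is high enough. A one-dimensional sketch with made-up mixture parameters (the real system uses high-dimensional cepstral features and Bayesian adaptation of the UBM):

```python
import math

def gmm_loglike(x, weights, means, variances):
    """Log-likelihood of scalar observations x under a 1-D Gaussian mixture."""
    total = 0.0
    for xi in x:
        p = sum(w * math.exp(-(xi - m) ** 2 / (2 * v)) / math.sqrt(2 * math.pi * v)
                for w, m, v in zip(weights, means, variances))
        total += math.log(p)
    return total

# Hypothetical models: a UBM and a speaker model adapted toward the speaker's data.
ubm     = ([0.5, 0.5], [-1.0, 1.0], [1.0, 1.0])
speaker = ([0.5, 0.5], [-0.5, 2.0], [1.0, 1.0])

test_utterance = [1.8, 2.1, -0.4, 2.3]

# Likelihood ratio test: accept if the speaker model explains the data
# sufficiently better than the background model.
llr = gmm_loglike(test_utterance, *speaker) - gmm_loglike(test_utterance, *ubm)
print(llr > 0)  # True
```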
A model of growth through creative destruction
, 1990
"... This paper develops a model based on Schumpeter's process of creative destruction. It departs from existing models of endogenous growth in emphasizing obsolescence of old technologies induced by the accumulation of knowledge and the resulting process of industrial innovations. This has both ..."
Abstract

Cited by 1914 (26 self)
the log of GNP follows a random walk with drift. The size of the drift is the average growth rate of the economy and it is endogenous to the model; in particular it depends on the size and likelihood of innovations resulting from research and also on the degree of market power available to an innovator.
Improved methods for building protein models in electron density maps and the location of errors in these models
 Acta Crystallogr. Sect. A
, 1991
"... Map interpretation remains a critical step in solving the structure of a macromolecule. Errors introduced at this early stage may persist throughout crystallographic refinement and result in an incorrect structure. The normally quoted crystallographic residual is often a poor description for the q ..."
Abstract

Cited by 1030 (9 self)
National debt in a neoclassical growth model
 American Economic Review
"... This paper contains a model designed to serve two purposes, to examine long-run competitive equilibrium in a growth model and then to explore the effects on this equilibrium of government debt. Samuelson [8] has examined the determination of interest rates in a single-commodity world without durable ..."
Abstract

Cited by 678 (0 self)
Graphs over Time: Densification Laws, Shrinking Diameters and Possible Explanations
, 2005
"... How do real graphs evolve over time? What are “normal” growth patterns in social, technological, and information networks? Many studies have discovered patterns in static graphs, identifying properties in a single snapshot of a large network, or in a very small number of snapshots; these include hea ..."
Abstract

Cited by 532 (48 self)
increase slowly as a function of the number of nodes (like O(log n) or O(log log n)). Existing graph generation models do not exhibit these types of behavior, even at a qualitative level. We provide a new graph generator, based on a “forest fire” spreading process, that has a simple, intuitive justification
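The densification law the paper studies says edges grow as a power of nodes, e(t) ∝ n(t)^a with 1 < a < 2, so the exponent can be estimated as the least-squares slope of log e against log n across snapshots. A sketch on made-up snapshot counts (the (nodes, edges) pairs below are hypothetical, not from the paper's datasets):

```python
import math

# Hypothetical (nodes, edges) snapshots of a growing graph obeying e ~ n^a.
snapshots = [(100, 300), (1000, 5300), (10000, 95000), (100000, 1700000)]

# Least-squares slope of log e against log n gives the densification exponent a.
xs = [math.log(n) for n, _ in snapshots]
ys = [math.log(e) for _, e in snapshots]
k = len(xs)
mx, my = sum(xs) / k, sum(ys) / k
a = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / sum((x - mx) ** 2 for x in xs)
print(round(a, 2))  # an exponent between 1 (constant avg degree) and 2 (dense)
```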