Results 1–10 of 10
Evaluating the Accuracy of Sampling-Based Approaches to the Calculation of Posterior Moments
In Bayesian Statistics, 1992
Abstract

Cited by 583 (14 self)
Data augmentation and Gibbs sampling are two closely related, sampling-based approaches to the calculation of posterior moments. The fact that each produces a sample whose constituents are neither independent nor identically distributed complicates the assessment of convergence and numerical accuracy of the approximations to the expected value of functions of interest under the posterior. In this paper, methods from spectral analysis are used to evaluate numerical accuracy formally and to construct diagnostics for convergence. These methods are illustrated in the normal linear model with informative priors, and in the Tobit censored regression model.
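The spectral-analysis convergence diagnostic described in this abstract underlies what is now commonly called the Geweke diagnostic: compare the mean of an early window of the chain with the mean of a late window. A minimal sketch follows; note that it uses naive (independence) standard errors, whereas Geweke's method estimates them from the spectral density at frequency zero, and the window fractions and test chains are illustrative.

```python
import numpy as np

def geweke_z(chain, first=0.1, last=0.5):
    """Z-score comparing the mean of the first `first` fraction of a
    chain with the mean of the last `last` fraction.
    NOTE: this sketch uses naive variances; the original method uses
    spectral-density estimates to account for autocorrelation."""
    n = len(chain)
    a = chain[: int(first * n)]          # early window
    b = chain[int((1.0 - last) * n):]    # late window
    se2 = a.var(ddof=1) / len(a) + b.var(ddof=1) / len(b)
    return (a.mean() - b.mean()) / np.sqrt(se2)

rng = np.random.default_rng(0)
z_stationary = geweke_z(rng.normal(size=10_000))  # |z| should be small
z_trending = geweke_z(np.arange(100.0))           # |z| should be large
```

A large |z| flags disagreement between the early and late parts of the chain, i.e., lack of convergence to stationarity.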
Approximations to the Log-likelihood Function in the Nonlinear Mixed Effects Model
Journal of Computational and Graphical Statistics, 1995
Abstract

Cited by 146 (4 self)
Introduction. Several different nonlinear mixed effects models and estimation methods for their parameters have been proposed in recent years (Sheiner and Beal, 1980; Mallet, Mentre, Steimer and Lokiek, 1988; Lindstrom and Bates, 1990; Vonesh and Carter, 1992; Davidian and Gallant, 1992; Wakefield, Smith, Racine-Poon and Gelfand, 1994). We consider here a slightly modified version of the model proposed in Lindstrom and Bates (1990). This model can be viewed as a hierarchical model that in some ways generalizes both the linear mixed effects model of Laird and Ware (1982) and the usual nonlinear model for independent data (Bates and Watts, 1988). In the first stage, the jth observation on the ith cluster is modeled as y_ij = f(φ_ij; x_ij) + ε_ij, i = 1, ..., M; j = 1, ..., n_i ...
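The two-stage structure in the snippet above, cluster-specific coefficients φ_ij equal to fixed effects plus random effects, pushed through a nonlinear mean function f, can be illustrated with a small simulation. The exponential-decay f, the parameter values, and the variance components below are all hypothetical, chosen only to show the structure, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical nonlinear mean function f(phi, x): exponential decay
# with a cluster-specific amplitude and rate (illustrative only).
def f(phi, x):
    return phi[0] * np.exp(-phi[1] * x)

M, n_i = 5, 8                       # clusters and observations per cluster
beta = np.array([10.0, 0.5])        # fixed effects (hypothetical)
x = np.linspace(0.0, 4.0, n_i)      # within-cluster covariate values

y = np.empty((M, n_i))
for i in range(M):
    b_i = rng.normal(0.0, [1.0, 0.05])       # random effects, diagonal Psi
    phi_i = beta + b_i                        # cluster-specific coefficients
    y[i] = f(phi_i, x) + rng.normal(0.0, 0.3, n_i)  # within-cluster error

print(y.shape)  # (5, 8)
```

Estimating beta and the variance components from such data requires integrating the random effects out of the likelihood, which is exactly the intractable step the paper's approximations address.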
Statistical Inference for Multiple Choice Tests
Psychometrika, 1991
Abstract

Cited by 2 (1 self)
Finite sample inference procedures are considered for analyzing the observed scores on a multiple choice test with several items, where, for example, the items are dissimilar, or the item responses are correlated. A discrete p-parameter exponential family model leads to a generalized linear model framework and, in a special case, a convenient regression of true score upon observed score. Techniques based upon the likelihood function, Akaike's information criterion (AIC), an approximate Bayesian marginalization procedure based on conditional maximization (BCM), and simulations for exact posterior densities (importance sampling) are used to facilitate finite sample investigations of the average true score, individual true scores, and various probabilities of interest. A simulation study suggests that, when the examinees come from two different populations, the exponential family can adequately generalize Duncan's beta-binomial model. Extensions to regression models, the classical test theory model, and empirical Bayes estimation problems are mentioned. The Duncan, Keats, and Matsumura data sets are used to illustrate potential advantages and flexibility of the exponential family model, and the BCM technique. Key words: multiple choice test, exponential family, likelihood, Akaike's information criterion, generalized linear model, Bayesian marginalization, importance sampling, regression of true score upon observed score, classical test theory model.
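Among the computational tools this abstract lists, importance sampling for exact posterior quantities is easy to sketch. The self-normalized estimator below targets a toy normal "posterior" chosen so the true answer is known; it is not the paper's exponential family model, and the proposal density is an arbitrary wider normal.

```python
import numpy as np

rng = np.random.default_rng(2)

# Toy unnormalized log-posterior: N(2, 0.5^2) up to a constant,
# chosen only so the importance-sampling answer can be checked.
def log_post(theta):
    return -0.5 * ((theta - 2.0) / 0.5) ** 2

# Importance (proposal) density: a wider normal near the mode.
draws = rng.normal(2.0, 2.0, 50_000)
log_q = -0.5 * ((draws - 2.0) / 2.0) ** 2 - np.log(2.0)

w = np.exp(log_post(draws) - log_q)   # unnormalized importance weights
w /= w.sum()                          # self-normalize (constants cancel)

post_mean = np.sum(w * draws)         # should be close to 2.0
```

Because the weights are self-normalized, only the posterior kernel is needed, which is what makes the technique useful for the finite-sample posterior densities the paper studies.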
Estimation Methods for the Multivariate t Distribution
DOI 10.1007/s10440-008-9212-8
Abstract
The known estimation and simulation methods for multivariate t distributions are reviewed. A review of selected applications is also provided. We believe that this review will serve as an important reference and encourage further research activities in the area.
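A standard simulation method for the multivariate t, of the kind such a review covers, is the normal/chi-square mixture representation: T = μ + Z / sqrt(W/ν) with Z multivariate normal and W chi-square. A minimal sketch, with arbitrary dimension, location, scale matrix, and degrees of freedom:

```python
import numpy as np

def rmvt(n, mu, sigma, df, rng):
    """Draw n samples from a multivariate t distribution via the
    mixture representation T = mu + Z / sqrt(W / df), where
    Z ~ N(0, Sigma) and W ~ chi-square(df)."""
    z = rng.multivariate_normal(np.zeros(len(mu)), sigma, size=n)
    w = rng.chisquare(df, size=n)
    return mu + z / np.sqrt(w / df)[:, None]

rng = np.random.default_rng(3)
mu = np.array([1.0, -1.0])
sigma = np.array([[1.0, 0.3], [0.3, 1.0]])
x = rmvt(100_000, mu, sigma, df=5, rng=rng)  # sample mean should be near mu
```

For df > 1 the mean of the resulting distribution is μ, which gives a quick sanity check on the simulator.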
Bayesian Analysis of the Additive Mixed Model for Randomized Block Designs
Abstract
This paper deals with the Bayesian analysis of additive mixed model experiments. Consider b randomly chosen subjects who respond once to each of t treatments. The subjects are treated as random effects and the treatment effects are fixed. Suppose that some prior information is available, thus motivating a Bayesian analysis. The Bayesian computation, however, can be difficult in this situation, especially when a large number of treatments is involved. Three computational methods are suggested to perform the analysis. The exact posterior density of any parameter of interest can be simulated based on random realizations taken from a restricted multivariate t distribution. The density can also be simulated using Markov chain Monte Carlo methods. The simulated density is accurate when a large number of random realizations is taken; however, it may take a substantial amount of computer time when many treatments are involved. An alternative Laplacian approximation is discussed. The Laplacian method produces smooth and very accurate approximations to posterior densities, and takes only seconds of computer time. An example of a pipeline cracks experiment is used to illustrate the Bayesian approaches and the computational methods. Key words: Laplacian approximation; Monte Carlo simulation.
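The Laplace (Laplacian) approximation mentioned above replaces the log-integrand by its second-order expansion at the mode and integrates the resulting Gaussian analytically. A minimal sketch on a Gamma-type integrand whose exact integral is known; the shape and rate values are arbitrary, and this is not the paper's mixed-model posterior.

```python
from math import gamma, sqrt, pi, exp, log

# Approximate I = integral of exp(h(t)) dt for h(t) = (a-1) log t - b t,
# whose exact value is Gamma(a) / b^a.
a, b = 6.0, 2.0                      # hypothetical shape and rate

h = lambda t: (a - 1.0) * log(t) - b * t
mode = (a - 1.0) / b                 # maximizer of h (set h'(t) = 0)
hess = -(a - 1.0) / mode ** 2        # h''(mode), negative at a maximum

# Laplace: I ~ exp(h(mode)) * sqrt(2*pi / |h''(mode)|)
laplace = exp(h(mode)) * sqrt(2.0 * pi / -hess)
exact = gamma(a) / b ** a

ratio = laplace / exact              # close to 1 for this integrand
```

The same mode-plus-Hessian recipe, applied to a marginal posterior kernel, is what yields the smooth approximate densities the abstract describes.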
Joint modeling of a primary endpoint and longitudinal data
Abstract
In many studies, the association between longitudinal measurements of a continuous response and a primary endpoint is of interest. A convenient framework for this type of problem is joint models, which are formulated to investigate the association between a primary endpoint and features of longitudinal measurements through a common set of latent random effects. The joint model, which is the focus of this article, is a logistic regression model with covariates defined as the individual-specific random effects in a non-linear random effects model for the longitudinal measurements. We discuss different estimation procedures, which include two-stage estimators, best linear unbiased predictors, and various numerical integration techniques. The proposed methods are illustrated using a real dataset where the objective is to study the association between longitudinal hormone levels and the pregnancy outcome in a group of young women. The numerical performance of the estimating methods is also evaluated by means of simulation. Key Words: Best linear unbiased predictor (BLUP); Gaussian quadrature methods; Laplace approximation; Logistic regression model; Non-linear mixed effects models; Two-stage estimator.
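The Gaussian quadrature methods listed among the key words are used to integrate over normally distributed random effects. A minimal Gauss-Hermite sketch for an expectation under a normal random effect; the variance and the test function are illustrative, not the joint model's actual integrand.

```python
import numpy as np

# Gauss-Hermite rule: sum_i w_i f(x_i) ~ integral of exp(-x^2) f(x) dx.
nodes, weights = np.polynomial.hermite.hermgauss(30)

def gh_expectation(g, s):
    """E[g(b)] for b ~ N(0, s^2) via the change of variables
    b = sqrt(2) * s * x, which converts the Gaussian density to the
    Gauss-Hermite weight exp(-x^2)."""
    return np.sum(weights * g(np.sqrt(2.0) * s * nodes)) / np.sqrt(np.pi)

# Sanity check: E[b^2] for b ~ N(0, 1.5^2) is the variance, 2.25.
second_moment = gh_expectation(lambda b: b ** 2, 1.5)
print(round(second_moment, 2))  # 2.25
```

In a joint model the function g would be the likelihood contribution of one subject given its random effect, so each subject's marginal likelihood becomes a short weighted sum.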
Bayesian Inference for Categorical Data Analysis
Abstract
This article surveys Bayesian methods for categorical data analysis, with primary emphasis on contingency table analysis. Early innovations were proposed by Good (1953, 1956, 1965) for smoothing proportions in contingency tables and by Lindley (1964) for inference about odds ratios. These approaches primarily used conjugate beta and Dirichlet priors. Altham (1969, 1971) presented Bayesian analogs of small-sample frequentist tests for 2×2 tables using such priors. An alternative approach using normal priors for logits received considerable attention in the 1970s by Leonard and others (e.g., Leonard 1972). Adopted usually in a hierarchical form, the logit-normal approach allows greater flexibility and scope for generalization. The 1970s also saw considerable interest in log-linear modeling. The advent of modern computational methods since the mid-1980s has led to a growing literature on fully Bayesian analyses with models for categorical data, with main emphasis on general ...
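The conjugate Dirichlet smoothing of contingency-table proportions that the survey traces back to Good is one line of arithmetic: add the prior pseudo-counts to the observed counts. The cell counts and the uniform prior below are hypothetical.

```python
import numpy as np

counts = np.array([12, 3, 0, 5])           # hypothetical cell counts
alpha = np.ones(counts.shape, dtype=float) # uniform Dirichlet prior

# Conjugacy: Dirichlet(alpha) prior + multinomial counts
# -> Dirichlet(alpha + counts) posterior.
posterior = alpha + counts
post_mean = posterior / posterior.sum()    # smoothed cell probabilities

print(post_mean.round(3))
```

Note that the empty cell receives a strictly positive smoothed probability (1/24 here), which is exactly the smoothing behavior that motivated these early Bayesian proposals.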
Approximations of marginal tail probabilities for a class of smooth functions with applications to Bayesian and conditional inference