Results 1-10 of 20
Is Bayes posterior just quick and dirty confidence?
, 2010
Cited by 25 (5 self)
Abstract:
Bayes (1763) introduced the observed likelihood function to statistical inference and provided a weight function to calibrate the parameter; he also introduced a confidence distribution on the parameter space but restricted attention to models now called location models; of course the names likelihood and confidence did not appear until much later: Fisher (1922) for likelihood and Neyman (1937) for confidence. Lindley (1958) showed that the Bayes and the confidence results were different when the model was not location. This paper examines the occurrence of true statements from the Bayes approach and from the confidence approach, and shows that the proportion of true statements in the Bayes case depends critically on the presence of linearity in the model; and with departure from this linearity the Bayes approach can be seriously misleading. Bayesian integration of weighted likelihood provides a first order linear approximation to confidence, but without linearity can give substantially incorrect results.
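The location-model agreement described in this abstract can be checked numerically: for a normal location model with known scale and a flat prior, the Bayes posterior interval is also an exact confidence interval. A minimal simulation sketch (the parameter values are illustrative, not taken from the paper):

```python
import random

# Location model: y ~ N(theta, SIGMA^2) with SIGMA known. Under a flat
# prior, the Bayes posterior given y is N(y, SIGMA^2), so the central 95%
# credible interval y +/- Z*SIGMA is the same interval as the usual 95%
# confidence interval. We verify its frequentist coverage by simulation.
random.seed(0)
SIGMA = 1.0
Z = 1.96                 # approximate 97.5% standard-normal quantile
THETA_TRUE = 3.0         # hypothetical true parameter, for illustration
REPS = 20000

covered = 0
for _ in range(REPS):
    y = random.gauss(THETA_TRUE, SIGMA)
    covered += (y - Z * SIGMA <= THETA_TRUE <= y + Z * SIGMA)

coverage = covered / REPS
print(round(coverage, 3))  # close to the nominal 0.95
```

With a nonlinear (non-location) parameterization the same flat-prior credible interval would no longer achieve nominal coverage, which is the divergence the paper quantifies.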
Inferential models: A framework for prior-free posterior probabilistic inference
 J. Amer. Statist. Assoc.
, 2013
A note on p-values interpreted as plausibilities
 Statist. Sinica
, 2014
The bootstrap and Markov chain Monte Carlo
"... This note concerns the use of parametric bootstrap sampling to carry out Bayesian inference calculations. This is only possible in a subset of those problems amenable to MCMC analysis, but when feasible the bootstrap approach offers both computational and theoretical advantages. The discussion here ..."
Abstract

Cited by 3 (0 self)
 Add to MetaCart
This note concerns the use of parametric bootstrap sampling to carry out Bayesian inference calculations. This is only possible in a subset of those problems amenable to MCMC analysis, but when feasible the bootstrap approach offers both computational and theoretical advantages. The discussion here is in terms of a simple example, with no attempt at a general analysis.
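One way to read the bootstrap-for-Bayes idea sketched in this abstract: parametric bootstrap replications of the MLE can serve as importance-sampling draws for a posterior. In the normal-mean case the bootstrap density of the estimate matches the likelihood in the parameter, so the importance weight reduces to the prior itself. A hedged sketch under those assumptions (the data, prior, and constants are illustrative, not the paper's example):

```python
import math
import random

# Model: y_1..y_n ~ N(theta, 1). The parametric bootstrap distribution of
# the sample mean is N(ybar, 1/n), which is proportional to the likelihood
# in theta, so reweighting bootstrap draws by a prior pi(theta) gives a
# self-normalized importance-sampling estimate of posterior expectations.
random.seed(1)
n = 50
data = [random.gauss(2.0, 1.0) for _ in range(n)]  # hypothetical data
ybar = sum(data) / n

def prior(theta):
    # assumed N(0, 10^2) prior, unnormalized; for illustration only
    return math.exp(-theta**2 / (2 * 10.0**2))

draws, weights = [], []
for _ in range(5000):
    # one parametric bootstrap replication: simulate from the fitted model,
    # then refit the MLE
    boot = [random.gauss(ybar, 1.0) for _ in range(n)]
    t = sum(boot) / n
    draws.append(t)
    weights.append(prior(t))

post_mean = sum(w * t for w, t in zip(weights, draws)) / sum(weights)
print(round(post_mean, 2))  # near ybar, shrunk slightly toward the prior mean
```

The computational appeal is that the bootstrap draws are independent, so no MCMC convergence diagnostics are needed; the restriction is that the proposal must come from a fitted parametric model.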
Bayesian Analysis or Evidence Based Statistics
"... The original Bayes proposal leads to likelihood and confidence for many simple examples. More generally it gives approximate confidence but to achieve exact confidence reliability it needs refinement of the argument and needs more than just the usual minimum of the likelihood function from observed ..."
Abstract

Cited by 2 (1 self)
 Add to MetaCart
The original Bayes proposal leads to likelihood and confidence for many simple examples. More generally it gives approximate confidence but to achieve exact confidence reliability it needs refinement of the argument and needs more than just the usual minimum of the likelihood function from observed data. A general
Coherent frequentism
, 2009
"... By representing the range of fair betting odds according to a pair of confidence set estimators, dual probability measures on parameter space called frequentist posteriors secure the coherence of subjective inference without any prior distribution. The closure of the set of expected losses correspon ..."
Abstract

Cited by 2 (2 self)
 Add to MetaCart
By representing the range of fair betting odds according to a pair of confidence set estimators, dual probability measures on parameter space called frequentist posteriors secure the coherence of subjective inference without any prior distribution. The closure of the set of expected losses corresponding to the dual frequentist posteriors constrains decisions without arbitrarily forcing optimization under all circumstances. This decision theory reduces to those that maximize expected utility when the pair of frequentist posteriors is induced by an exact or approximate confidence set estimator or when an automatic reduction rule is applied to the pair. In such cases, the resulting frequentist posterior is coherent in the sense that, as a probability distribution of the parameter of interest, it satisfies the axioms of the decision-theoretic and logic-theoretic systems typically cited in support of the Bayesian posterior. Unlike the p-value, the confidence level of an interval hypothesis derived from such a measure is suitable as an estimator of the indicator of hypothesis truth since it converges in sample-space probability to 1 if the hypothesis is true or to 0 otherwise under general conditions.
Vector exponential models and second order analysis
 Pakistan Journal of Statistics and Operation Research
, 2012
"... For an exponential model with scalar parameter, WelchP:1963 examined the role of Bayesian analysis in statistical inference, more specifically the use of the Jeffreys:1946 prior. They determined that Bayesian intervals and thus in effect Bayesian quantiles had second order confidence accuracy. We us ..."
Abstract

Cited by 1 (0 self)
 Add to MetaCart
(Show Context)
For an exponential model with scalar parameter, Welch and Peers (1963) examined the role of Bayesian analysis in statistical inference, more specifically the use of the Jeffreys (1946) prior. They determined that Bayesian intervals, and thus in effect Bayesian quantiles, had second order confidence accuracy. We use a Taylor series expansion of the log-model to develop a second order version of the vector exponential model; this is developed as a contribution to theory in statistics at a time when algorithms are prominent, and it provides a basis for generalizing the Welch-Peers approach to the vector parameter context.
D.A.S. Fraser, Uyen Hoang, Kexin Ji, Xufei Li, Li Li, Wei Lin, Jie Su. Pak. J. Stat. Oper. Res., Vol. VIII, No. 3, 2012, pp. 293-299.
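The Welch-Peers matching property referenced in this abstract can be illustrated by simulation: for data from an Exponential(rate theta) model, the Jeffreys prior 1/theta yields a Gamma(n, scale = 1/sum(y)) posterior whose credible intervals have frequentist coverage at their credibility level (here the match is actually exact, since theta is a scale parameter; Welch-Peers guarantees second order accuracy more generally). A sketch with illustrative values, not from the paper:

```python
import random

# y_i ~ Exponential(rate THETA), i = 1..n. Jeffreys prior 1/theta gives
# posterior theta | y ~ Gamma(n, scale = 1/S) with S = sum(y). We estimate
# the frequentist coverage of the central 95% posterior interval.
random.seed(2)
THETA = 1.5          # hypothetical true rate
n = 10
REPS = 1000          # data sets simulated
DRAWS = 1000         # posterior draws per data set

covered = 0
for _ in range(REPS):
    s = sum(random.expovariate(THETA) for _ in range(n))
    post = sorted(random.gammavariate(n, 1.0 / s) for _ in range(DRAWS))
    lo, hi = post[int(0.025 * DRAWS)], post[int(0.975 * DRAWS)]
    covered += (lo <= THETA <= hi)

coverage = covered / REPS
print(round(coverage, 2))  # near the nominal 0.95
```

With a prior other than Jeffreys', or a parameter that is not a pure scale, the simulated coverage would depart from 0.95 at second order, which is the gap the vector generalization in the paper addresses.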
PARAMETER CURVATURE REVISITED AND THE BAYES-FREQUENTIST DIVERGENCE.
"... Parameter curvature was introduced by Efron (1975) for classifying curved exponential models. We develop an alternative definition that describes curvature relative to location models. This modified curvature calibrates how Bayes posterior probability differs from familiar frequency based probabilit ..."
Abstract
 Add to MetaCart
Parameter curvature was introduced by Efron (1975) for classifying curved exponential models. We develop an alternative definition that describes curvature relative to location models. This modified curvature calibrates how Bayes posterior probability differs from familiar frequency based probability. And it provides a basis for then correcting Bayes probabilities to agree with the reproducibility traditional to mainstream statistics. The two curvatures are compared and examples are given.
On default priors and approximate location models
 Submitted to the Brazilian Journal of Probability and Statistics, arXiv: math.PR/0000000
"... a ..."
(Show Context)