Results 1–10 of 22
Ordering and improving the performance of Monte Carlo Markov chains
 Statistical Science
, 2001
Abstract

Cited by 28 (0 self)
Abstract. An overview of orderings defined on the space of Markov chains having a prespecified unique stationary distribution is given. The intuition gained by studying these orderings is used to improve existing Markov chain Monte Carlo algorithms. Key words and phrases: Asymptotic variance, convergence ordering,
Quasi-Likelihood Models and Optimal Inference
 Ann. Statist.
Abstract

Cited by 16 (5 self)
Consider an ergodic Markov chain on the real line, with parametric models for the conditional mean and variance of the transition distribution. Such a setting is an instance of a quasi-likelihood model. The customary estimator for the parameter is the maximum quasi-likelihood estimator. It is not efficient, but as good as the best estimator that ignores the parametric model for the conditional variance. We construct two efficient estimators. One is a convex combination of solutions of two estimating equations, the other a weighted nonlinear one-step least squares estimator, with weights involving predictors for the third and fourth centered conditional moments of the transition distribution. Additional restrictions on the model can lead to further improvement. We illustrate this with an autoregressive model whose error variance is related to the autoregression parameter.

1 Introduction

According to Wedderburn (1974), a quasi-likelihood model is defined by a relation between mean and v...
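The maximum quasi-likelihood estimator solves an estimating equation built from the modelled conditional mean and variance. As a hedged illustration (a toy case, not the paper's construction): for conditional mean m(x, θ) = θx and constant conditional variance, the estimating equation Σ X_{i-1}(X_i − θX_{i-1}) = 0 has a closed-form solution.

```python
import numpy as np

def qmle_ar1(x):
    """Maximum quasi-likelihood estimate of theta in the toy model
    E[X_i | X_{i-1}] = theta * X_{i-1} with constant conditional variance.
    Solves sum_i X_{i-1} * (X_i - theta * X_{i-1}) = 0 in closed form.
    (Function name and setup are illustrative, not from the paper.)"""
    x = np.asarray(x, dtype=float)
    return np.dot(x[:-1], x[1:]) / np.dot(x[:-1], x[:-1])
```

With a non-constant conditional variance v(x, θ), each summand would be weighted by 1/v(X_{i-1}, θ) and the equation solved numerically.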
Application of Convolution Theorems in Semiparametric Models with non-i.i.d. Data
Abstract

Cited by 12 (2 self)
A useful approach to asymptotic efficiency for estimators in semiparametric models is the study of lower bounds on asymptotic variances via convolution theorems. Such theorems are often applicable in models in which the classical assumptions of independence and identical distributions fail to hold, but to date, much of the research has focused on semiparametric models with independent and identically distributed (i.i.d.) data because tools are available in the i.i.d. setting for verifying preconditions of the convolution theorems. We develop tools for non-i.i.d. data that are similar in spirit to those for i.i.d. data and also analogous to the approaches used in parametric models with dependent data. This involves extending the notion of the tangent vector figuring so prominently in the i.i.d. theory and providing conditions for smoothness, or differentiability, of the parameter of interest as a function of the underlying probability measures. As a corollary to the differentiability result we obtain sufficient conditions for equivalence, in terms of asymptotic variance bounds, of two models. Regularity and asymptotic linearity of estimators are also discussed.
Reversible Markov chains and optimality of symmetrized empirical estimators
 Bernoulli
, 1998
Abstract

Cited by 11 (8 self)
Suppose we want to estimate the expectation of a function of two arguments under the stationary distribution of two successive observations of a reversible Markov chain. Then the usual empirical estimator can be improved by symmetrizing. We show that the symmetrized estimator is efficient. We point out applications to discretely observed continuous-time processes. The proof is based on a result for general Markov chain models which can be used to characterize efficient estimators in any model defined by restrictions on the stationary distribution of a single or two successive observations.

1 Introduction

Suppose we observe X_0, ..., X_n from an ergodic Markov chain with unknown transition distribution Q(x, dy) and invariant distribution π(dx). We want to estimate the expectation of a function f(x, y) under the joint stationary distribution (π ⊗ Q)(dx, dy) = π(dx) Q(x, dy) of two successive observations. Greenwood and Wefelmeyer (1995) show that the empirical estimator E_n f = ...
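The symmetrization described above replaces each pair evaluation f(X_{i-1}, X_i) by the average over both orderings of the pair. A minimal sketch (illustrative names, not code from the paper):

```python
import numpy as np

def empirical_estimator(x, f):
    """Usual empirical estimator of E[f(X_0, X_1)] over successive pairs."""
    return float(np.mean([f(a, b) for a, b in zip(x[:-1], x[1:])]))

def symmetrized_estimator(x, f):
    """For a reversible chain, (X_{i-1}, X_i) and (X_i, X_{i-1}) have the
    same stationary law, so averaging f over both orderings of each pair
    is also consistent and can reduce asymptotic variance."""
    return float(np.mean([0.5 * (f(a, b) + f(b, a))
                          for a, b in zip(x[:-1], x[1:])]))
```

For symmetric f the two estimators coincide; the gain from symmetrizing appears for asymmetric f.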
Estimating invariant laws of linear processes by U-statistics
Abstract

Cited by 11 (10 self)
Suppose we observe an invertible linear process with independent, mean-zero innovations, and with coefficients depending on a finite-dimensional...
Improved estimators for constrained Markov chain models
Abstract

Cited by 7 (7 self)
Suppose we observe an ergodic Markov chain and know that the stationary law of one or two successive observations fulfills a linear constraint. We show how to improve given estimators exploiting this knowledge, and prove that the best of these estimators is efficient.
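One standard way to exploit a known linear constraint E[g] = c on the stationary law is a control-variate-style correction of a given estimator, with the coefficient fitted by regression. This is a hedged sketch of that generic idea, not the paper's (efficient) construction:

```python
import numpy as np

def constraint_improved(values, g_values, g_mean):
    """Improve the empirical estimator of E[f] using the known constraint
    E[g] = g_mean: subtract the fitted linear predictor of the error
    (a control-variate correction; names here are illustrative)."""
    values = np.asarray(values, dtype=float)
    g = np.asarray(g_values, dtype=float)
    # Regression coefficient of f-values on g-values (sample covariance / variance).
    beta = np.cov(values, g)[0, 1] / np.var(g, ddof=1)
    return values.mean() - beta * (g.mean() - g_mean)
```

The correction leaves the mean unchanged when the empirical mean of g already matches the constraint, and removes the component of the estimation error explained by g otherwise.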
Maximum likelihood estimator and Kullback-Leibler information in misspecified Markov chain models
 Teor. Veroyatnost. i Primenen.
Abstract

Cited by 6 (4 self)
Suppose we have specified a parametric model for the transition distribution of a Markov chain, but that the true transition distribution does not belong to the model. Then the maximum likelihood estimator estimates the parameter which maximizes the Kullback-Leibler information between the true transition distribution and the model. We prove that the maximum likelihood estimator is asymptotically efficient in a nonparametric sense if the true transition distribution is unknown.

1 Introduction

Suppose we observe X_0, ..., X_n from an ergodic Markov chain on an arbitrary state space. We have specified a parametric model Q_ϑ(x, dy) for the transition distribution, and an initial distribution ν_0(dx). Consider the following two situations:
1. We believe, erroneously, that the model is correct, and use the maximum likelihood estimator for estimating the parameter.
2. We know that the model is incorrect, and want to fit a transition distribution from the model to the true transition ...
Estimating Joint Distributions of Markov Chains
 IN PREPARATION
, 1998
Abstract

Cited by 5 (3 self)
Suppose we observe a stationary Markov chain with unknown transition distribution. The empirical estimator for the expectation of a function of two successive observations is known to be efficient. For reversible Markov chains, an appropriate symmetrization is efficient. For functions of more than two arguments, these estimators cease to be efficient. We determine the influence function of efficient estimators of expectations of functions of several observations, both for completely unknown and for reversible Markov chains. We construct simple efficient estimators in both cases.
Empirical estimators for semi-Markov processes
 Math. Meth. Statist.
, 1996
Abstract

Cited by 5 (3 self)
A semi-Markov process stays in state x for a time s and then jumps to state y according to a transition distribution Q(x, dy, ds). A statistical model is described by a family of such transition distributions. We give conditions for a nonparametric version of local asymptotic normality as the observation time tends to infinity. Then we introduce 'empirical' estimators for linear functionals of the distribution π(dx) Q(x, dy, ds), with π denoting the invariant distribution of the embedded Markov chain, and characterize the empirical estimators which are efficient for a given model. We discuss efficiency of several classical estimators, in particular the jump frequency, the proportion of visits to a given set, the proportion of time spent in a set, and an estimator for Q(x, {y} × [0, t]) suggested by Moore and Pyke (1968) for countable state space.

1 Introduction

A semi-Markov process Y = (Y_t)_{t≥0} is a process on the time interval [0, ∞), with values in some arbitrary state spac...
Efficient estimation of invariant distributions of some semiparametric Markov chain models
, 1998
Abstract

Cited by 3 (3 self)
We characterize efficient estimators for the expectation of a function under the invariant distribution of a Markov chain and outline ways of constructing such estimators. We consider two models. The first is described by a parametric family of constraints on the transition distribution; the second is the heteroscedastic nonlinear autoregressive model. The efficient estimator for the first model adds a correction term to the empirical estimator. In the second model, the suggested efficient estimator is a one-step improvement of an initial estimator which might be obtained by a version of Markov chain Monte Carlo.