Results 1–5 of 5
Root-n Consistent and Optimal Density Estimators for Moving Average Processes
Abstract

Cited by 21 (17 self)
The marginal density of a first-order moving average process can be written as a convolution of two innovation densities. Saavedra and Cao (2000) propose to estimate the marginal density by plugging in kernel density estimators for the innovation densities, based on estimated innovations. They show that for an appropriate choice of bandwidth the variance of their estimator decreases at the rate 1/n. Their estimator can be interpreted as a specific U-statistic. We suggest a slightly simplified U-statistic as estimator of the marginal density, prove that it is asymptotically normal at the same rate, and describe the asymptotic variance explicitly. We show that the estimator is asymptotically efficient if no structural assumptions are made on the innovation density. For innovation densities known to have mean zero or to be symmetric, we describe improvements of our estimator which are again asymptotically efficient.
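The plug-in idea in this abstract can be sketched numerically. The sketch below is illustrative only and assumes a Gaussian kernel, a known MA(1) coefficient `theta`, and innovations recovered by naive recursive inversion; the paper's exact U-statistic and bandwidth choice may differ.

```python
import numpy as np

def ma1_marginal_density_ustat(x, innovations, theta, bandwidth):
    """U-statistic-type plug-in estimator of the MA(1) marginal density at x.

    For X_t = e_t + theta * e_{t-1}, the marginal density is the convolution
    of the densities of e_t and theta * e_{t-1}.  Averaging a kernel over all
    pairwise sums e_i + theta * e_j of estimated innovations estimates that
    convolution directly.
    """
    e = np.asarray(innovations, dtype=float)
    # All pairwise sums e_i + theta * e_j (the strict U-statistic would skip
    # i == j; including those terms is asymptotically negligible).
    pair_sums = e[:, None] + theta * e[None, :]
    u = (x - pair_sums) / bandwidth
    k = np.exp(-0.5 * u**2) / np.sqrt(2.0 * np.pi)  # Gaussian kernel
    return k.mean() / bandwidth

# Example: simulate an MA(1) series, recover innovations by recursive
# inversion e_t = x_t - theta * e_{t-1}, and estimate the density at 0.
rng = np.random.default_rng(0)
theta = 0.5
eps = rng.standard_normal(1000)
x_series = eps[1:] + theta * eps[:-1]
e_hat = np.zeros(len(x_series))
for t in range(len(x_series)):
    prev = e_hat[t - 1] if t > 0 else 0.0
    e_hat[t] = x_series[t] - theta * prev
print(ma1_marginal_density_ustat(0.0, e_hat, theta, bandwidth=0.5))
```

With standard normal innovations the true marginal law is N(0, 1 + theta²), so the estimate at 0 should land near 0.36 up to smoothing bias.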
Weighted Residual-Based Density Estimators for Nonlinear Autoregressive Models
Abstract

Cited by 18 (13 self)
This paper considers residual-based and randomly weighted kernel estimators for the innovation densities of nonlinear autoregressive models. The weights are chosen to make use of the information that the innovations have mean zero. Rates of convergence are obtained in weighted L1-norms. These estimators give rise to smoothed and weighted empirical distribution functions and moments. It is shown that the latter are efficient if an efficient estimator of the autoregression parameter is used to construct the residuals.
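A minimal sketch of a weighted residual-based kernel estimator follows. It uses a linearized empirical-likelihood-type weighting that forces the weighted residuals to have mean zero; the weighting scheme actually analyzed in the paper may differ, and the Gaussian kernel and bandwidth are illustrative choices.

```python
import numpy as np

def weighted_residual_density(x, residuals, bandwidth):
    """Weighted kernel density estimate of the innovation density at x.

    Weights w_i = (1 - lam * (e_i - mean(e))) / n with lam = mean(e) / var(e)
    satisfy sum(w) = 1 and sum(w * e) = 0, exploiting the model information
    E[e_t] = 0 (a linearized empirical-likelihood choice, not necessarily
    the paper's exact construction).
    """
    e = np.asarray(residuals, dtype=float)
    n = e.size
    ebar = e.mean()
    s2 = ((e - ebar) ** 2).mean()
    lam = ebar / s2
    w = (1.0 - lam * (e - ebar)) / n
    u = (np.asarray(x, dtype=float)[..., None] - e) / bandwidth
    k = np.exp(-0.5 * u**2) / np.sqrt(2.0 * np.pi)  # Gaussian kernel
    return (k * w).sum(axis=-1) / bandwidth

# Example: density of standard normal residuals evaluated at 0.
rng = np.random.default_rng(0)
res = rng.standard_normal(500)
print(weighted_residual_density(0.0, res, bandwidth=0.4))
```

The correction is small when the residual mean is already near zero, but it removes the first-order effect of the mean-zero constraint on the plug-in estimator.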
Estimating invariant laws of linear processes by U-statistics
Abstract

Cited by 11 (10 self)
Suppose we observe an invertible linear process with independent mean-zero innovations, and with coefficients depending on a finite-dimensional...
Improved estimators for constrained Markov chain models
Abstract

Cited by 7 (7 self)
Suppose we observe an ergodic Markov chain and know that the stationary law of one or two successive observations fulfills a linear constraint. We show how to improve given estimators by exploiting this knowledge, and prove that the best of these estimators is efficient.
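One simple way a known mean-zero constraint can improve an estimator is a control-variate correction, sketched below. This is only an illustration of the general idea: the function `improve_by_constraint`, the AR(1) example, and the iid-style covariance estimates are all assumptions of this sketch, whereas the paper's efficient construction is based on asymptotic covariances of the chain.

```python
import numpy as np

def improve_by_constraint(h_vals, g_vals):
    """Control-variate style improvement of a sample-mean estimator.

    h_vals[i] = h(X_i, X_{i+1}): terms whose average estimates the target.
    g_vals[i] = g(X_i, X_{i+1}): known from the model to have stationary
    mean zero (the linear constraint).  Subtracting the estimated-best
    multiple of the empirical mean of g typically reduces the variance.
    """
    h = np.asarray(h_vals, dtype=float)
    g = np.asarray(g_vals, dtype=float)
    c = ((h - h.mean()) * (g - g.mean())).mean() / g.var()
    return h.mean() - c * g.mean()

# Example: estimate the (zero) stationary mean of an AR(1) chain,
# exploiting the constraint E[X_{t+1} - rho * X_t] = 0.
rng = np.random.default_rng(0)
rho, n = 0.8, 2000
x = np.zeros(n)
for t in range(1, n):
    x[t] = rho * x[t - 1] + rng.standard_normal()
improved = improve_by_constraint(x[1:], x[1:] - rho * x[:-1])
print(improved)
```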
Improved estimators for constrained Markov chain models
, 2001
Abstract
Suppose we observe an ergodic Markov chain and know that the stationary law of one or two successive observations fulfills a linear constraint. We show how to improve the given estimators by exploiting this knowledge, and prove that the best of these estimators is efficient. © 2001 Elsevier Science B.V. All rights reserved. MSC: primary 62M05; secondary 62G05; 62G20