Results 1–10 of 13
Root-n Consistent and Optimal Density Estimators for Moving Average Processes
Abstract

Cited by 21 (17 self)
The marginal density of a first-order moving average process can be written as the convolution of two innovation densities. Saavedra and Cao (2000) propose to estimate the marginal density by plugging in kernel density estimators for the innovation densities, based on estimated innovations. They obtain that for an appropriate choice of bandwidth the variance of their estimator decreases at the rate 1/n. Their estimator can be interpreted as a specific U-statistic. We suggest a slightly simplified U-statistic as estimator of the marginal density, prove that it is asymptotically normal at the same rate, and describe the asymptotic variance explicitly. We show that the estimator is asymptotically efficient if no structural assumptions are made on the innovation density. For innovation densities known to have mean zero or to be symmetric, we describe improvements of our estimator which are again asymptotically efficient.
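The pair-average structure of such a U-statistic density estimator can be sketched in a few lines. The following is an illustrative implementation only, not the authors' code: the Gaussian kernel, the bandwidth, and all names are our own choices.

```python
import numpy as np

def u_statistic_density(x, eps_hat, theta_hat, bandwidth):
    """U-statistic estimator of the marginal density of an MA(1) process
    X_t = eps_t + theta * eps_{t-1}, evaluated at the point x.

    Averages a Gaussian kernel over all ordered pairs (i, j), i != j,
    of estimated innovations.  Illustrative sketch, not the paper's code.
    """
    e = np.asarray(eps_hat, dtype=float)
    n = len(e)
    # All pairwise sums eps_i + theta * eps_j, with the diagonal i == j removed.
    pair_sums = e[:, None] + theta_hat * e[None, :]
    mask = ~np.eye(n, dtype=bool)
    u = (x - pair_sums[mask]) / bandwidth
    kernel = np.exp(-0.5 * u**2) / np.sqrt(2 * np.pi)
    return float(kernel.mean() / bandwidth)

# MA(1) with standard normal innovations: the marginal law is
# N(0, 1 + theta^2), so the density at 0 is about 0.357 for theta = 0.5.
rng = np.random.default_rng(0)
theta = 0.5
eps = rng.standard_normal(500)
estimate = u_statistic_density(0.0, eps, theta, bandwidth=0.5)
```

In practice the innovations would themselves be estimated from the observed series; here they are simulated directly to keep the sketch short.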
Estimating the Innovation Distribution in Nonlinear Autoregressive Models
 Department of Mathematics, University of Siegen. http://www.math.unisiegen.de/statistik/wefelmeyer.html
Abstract

Cited by 17 (17 self)
The usual estimator for the expectation of a function under the innovation distribution of a nonlinear autoregressive model is the empirical estimator based on estimated innovations. It can be improved by exploiting that the innovation distribution has mean zero. We show that the resulting estimator is efficient if the innovations are estimated with an efficient estimator for the autoregression parameter. Efficiency of this estimator is necessary except when the expectation of the function can be estimated adaptively. Analogous results hold for heteroscedastic models.
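The mean-zero improvement described above amounts to subtracting an estimated projection of the function onto the innovation mean. A minimal sketch, assuming a linear least-squares correction; the function names and the choice h(eps) = eps**3 are illustrative, not from the paper.

```python
import numpy as np

def mean_zero_corrected(h_vals, eps_hat):
    """Empirical estimator of E[h(eps)] improved by the constraint
    E[eps] = 0: subtract c * mean(eps_hat), where c is the least-squares
    slope of h(eps_hat) on eps_hat.  The correction vanishes in the
    limit but cancels part of the sampling variability.  Illustrative
    sketch only.
    """
    h = np.asarray(h_vals, dtype=float)
    e = np.asarray(eps_hat, dtype=float)
    c = np.cov(h, e)[0, 1] / np.var(e)
    return float(h.mean() - c * e.mean())

# For h(eps) = eps**3 under standard normal innovations, E[h(eps)] = 0.
rng = np.random.default_rng(1)
eps = rng.standard_normal(2000)
plain = float(np.mean(eps**3))          # uncorrected empirical estimator
improved = mean_zero_corrected(eps**3, eps)
```

In the autoregressive setting the `eps_hat` would be residuals computed from an estimated autoregression parameter rather than the true innovations.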
Estimating invariant laws of linear processes by U-statistics
Abstract

Cited by 11 (10 self)
Suppose we observe an invertible linear process with independent mean zero innovations, and with coefficients depending on a finite-dimensional...
Plug-In Estimators in Semiparametric Stochastic Process Models
Abstract

Cited by 6 (5 self)
Consider a locally asymptotically normal semiparametric model with a real parameter θ and a possibly infinite-dimensional parameter F. We are interested in estimating a real-valued functional a(F). If â_θ estimates a(F) for known θ, and θ̂ estimates θ, then the plug-in estimator â_θ̂ estimates a(F) if θ is unknown. We show that â_θ̂ is asymptotically linear and regular if â_θ and θ̂ are, and calculate the influence function and the asymptotic variance of â_θ̂. If a(F) can be estimated adaptively with respect to θ, then â_θ̂ is efficient if â_θ is efficient. If a(F) cannot be estimated adaptively, then for â_θ̂ to be efficient, θ̂ must also be efficient. We illustrate the results with stochastic process models, in particular with time series models, and discuss extensions of the results.
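A toy instance of the plug-in construction: an estimator of the innovation variance in a linear AR(1) model, defined for a known autoregression parameter, with an estimated parameter plugged in. A minimal sketch with illustrative names, not taken from the paper.

```python
import numpy as np

def innovation_variance(x, theta):
    """Estimator of the innovation variance in the linear AR(1) model
    X_t = theta * X_{t-1} + eps_t, for a given parameter value theta.
    This plays the role of the known-parameter estimator."""
    x = np.asarray(x, dtype=float)
    resid = x[1:] - theta * x[:-1]
    return float(np.mean(resid**2))

# Simulate an AR(1) path with theta = 0.5 and unit innovation variance.
rng = np.random.default_rng(4)
x = np.zeros(2001)
for t in range(1, 2001):
    x[t] = 0.5 * x[t - 1] + rng.standard_normal()

# Plug-in step: replace the unknown theta by a least-squares estimate.
theta_hat = (x[:-1] @ x[1:]) / (x[:-1] @ x[:-1])
sigma2_hat = innovation_variance(x, theta_hat)
```

The paper's results concern when this substitution preserves asymptotic linearity, regularity, and efficiency; the sketch only shows the mechanical plug-in step.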
Asymptotics for nonparametric regression
Sankhyā A, 1993
"... the error distribution function in ..."
Efficient estimation of linear functionals of a bivariate distribution with equal but unknown marginals: the least-squares approach
Journal of Multivariate Analysis, 2005
Abstract

Cited by 4 (3 self)
In this paper we characterize and construct efficient estimators of linear functionals of a bivariate distribution with equal marginals. An efficient estimator equals the empirical estimator minus a correction term and provides significant improvements over the empirical estimator. We construct an efficient estimator by estimating the correction term. For this we use the least-squares principle and an estimated orthonormal basis for the Hilbert space of square-integrable functions under the unknown equal marginal distribution. Simulations confirm the asymptotic behavior of this estimator in moderate sample sizes and the considerable theoretical gains over the empirical estimator. 1. Introduction. Let (X1, Y1), ..., (Xn, Yn) be independent copies of a bivariate random vector (X, Y) with distribution Q. Let ... be a measurable function from R2 to R such that ...
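A simplified sketch of the correction idea: the empirical mean minus a least-squares projection onto mean-zero directions generated by the equal marginals. A small fixed basis is used in place of the paper's estimated orthonormal basis, and all names are illustrative.

```python
import numpy as np

def corrected_empirical(psi_vals, x, y, basis):
    """Empirical estimator of E[psi(X, Y)] minus an estimated correction
    term, for X and Y with equal (unknown) marginals.

    The correction is the least-squares projection of psi onto the span
    of phi_k(X) - phi_k(Y); these directions have mean zero whenever the
    marginals are equal, so subtracting the projection leaves the limit
    unchanged while it can reduce variance.  Illustrative sketch only.
    """
    psi = np.asarray(psi_vals, dtype=float)
    # Design matrix of mean-zero directions, one column per basis function.
    Z = np.column_stack([phi(x) - phi(y) for phi in basis])
    coef, *_ = np.linalg.lstsq(Z, psi - psi.mean(), rcond=None)
    return float(psi.mean() - Z.mean(axis=0) @ coef)

# Bivariate normal with equal N(0, 1) marginals and correlation 0.6;
# the target functional E[XY] equals 0.6.
rng = np.random.default_rng(2)
x = rng.standard_normal(1000)
y = 0.6 * x + 0.8 * rng.standard_normal(1000)
est = corrected_empirical(x * y, x, y, [lambda t: t, lambda t: t**2])
```

The paper's efficient estimator lets the basis grow with the sample and estimates it from the pooled observations; the fixed two-function basis here only illustrates the projection mechanics.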
Estimators for Models with Constraints Involving Unknown Parameters
Abstract

Cited by 3 (3 self)
Suppose we have independent observations from a distribution which we know to fulfill a finite-dimensional linear constraint involving an unknown finite-dimensional parameter. We construct efficient estimators for finite-dimensional functionals of the distribution. The estimators are obtained by first constructing an efficient estimator for the functional when the parameter is known, and then replacing the parameter by an efficient estimator. We consider in particular estimation of expectations.
On Asymptotic Differentiability Of Averages
1999
Abstract

Cited by 2 (1 self)
This paper proves an asymptotic expansion useful in the construction of estimates with a prescribed influence function in parametric and semiparametric models.
The Accelerated Failure Time Model under Cross-Sectional Sampling Schemes
Abstract

Cited by 1 (0 self)
For the degree of doctor at the Universiteit van Amsterdam, by authority of the Rector Magnificus, prof. dr. J.W. Zwemmer, to be defended in public in the Aula of the University before a committee appointed by the board for doctorates.
CHAPTER 1 EFFICIENT ESTIMATORS FOR TIME SERIES
Abstract
We illustrate several recent results on efficient estimation for semiparametric time series models with a simple class of models: first-order nonlinear autoregression with independent innovations. We consider in particular estimation of the autoregression parameter, the innovation distribution, conditional expectations, the stationary distribution, the stationary density, and higher-order transition densities.
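For the linear special case of the first-order autoregression, parameter estimation and innovation recovery can be sketched as follows. This is a rough illustration with made-up names, not code from the chapter.

```python
import numpy as np

def fit_ar1(x):
    """Least-squares estimate of theta in X_t = theta * X_{t-1} + eps_t,
    together with the estimated innovations
    eps_hat_t = X_t - theta_hat * X_{t-1}."""
    x = np.asarray(x, dtype=float)
    lagged, current = x[:-1], x[1:]
    theta_hat = (lagged @ current) / (lagged @ lagged)
    eps_hat = current - theta_hat * lagged
    return float(theta_hat), eps_hat

# Simulate a stationary linear AR(1) path and recover its parameter;
# the estimated innovations are the raw material for the distribution,
# density, and conditional-expectation estimators discussed above.
rng = np.random.default_rng(3)
x = np.zeros(2001)
for t in range(1, 2001):
    x[t] = 0.7 * x[t - 1] + rng.standard_normal()
theta_hat, eps_hat = fit_ar1(x)
```

The nonlinear case replaces the linear regression function by a parametric family and least squares by the corresponding M-estimation step.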