Results 1 – 10 of 53
Testing for Common Trends
Journal of the American Statistical Association, 1988
"... Cointegrated multiple time series share at least one common trend. Two tests are developed for the number of common stochastic trends (i.e., for the order of cointegration) in a multiple time series with and without drift. Both tests involve the roots of the ordinary least squares coefficient matrix ..."
Cited by 464 (7 self)
first-order autocorrelation matrix, where the correction is essentially a sum of the autocovariance matrices. Previous researchers have found that U.S. postwar interest rates, taken individually, appear to be integrated of order 1. In addition, the theory of the term structure implies that yields
NOTES AND PROBLEMS: LONG-RUN COVARIANCE MATRICES FOR FRACTIONALLY INTEGRATED PROCESSES
, 2007
"... An asymptotic expansion is given for the autocovariance matrix of a vector of stationary longmemory processes with memory parameters d � ..."
Banding sample autocovariance matrices of stationary processes
Statistica Sinica, 2009
"... Abstract: We consider estimation of covariance matrices of stationary processes. Under a shortrange dependence condition for a wide class of nonlinear processes, it is shown that the banded covariance matrix estimates converge in operator norm to the true covariance matrix with explicit rates of co ..."
Cited by 14 (2 self)
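The banding estimator described in this entry admits a compact sketch: form the usual Toeplitz matrix of sample autocovariances and zero every entry more than a fixed number of lags off the diagonal. A minimal illustration (the paper's bandwidth selection rule and the process class behind its convergence rates are not reproduced here):

```python
import numpy as np

def banded_autocovariance(x, band):
    """Toeplitz matrix of sample autocovariances of the series x,
    with entries more than `band` lags off the diagonal set to zero."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    xc = x - x.mean()
    # sample autocovariances gamma_hat(k), k = 0, ..., n-1
    gamma = np.array([xc[:n - k] @ xc[k:] / n for k in range(n)])
    # Sigma_hat[i, j] = gamma_hat(|i - j|), then apply the band
    lags = np.abs(np.subtract.outer(np.arange(n), np.arange(n)))
    sigma = gamma[lags]
    sigma[lags > band] = 0.0
    return sigma

rng = np.random.default_rng(0)
S = banded_autocovariance(rng.standard_normal(200), band=5)
```

The result is symmetric and banded by construction; choosing `band` to grow slowly with the sample size is what drives the operator-norm consistency the entry refers to.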
Efficient Computation of Linearized Cross-Covariance and Auto-Covariance Matrices of Interdependent Quantities
, 2003
"... In many geostatistical applications, spatially discretized unknowns are conditioned on observations that depend on the unknowns in a form that can be linearized. Conditioning takes several matrix–matrix multiplications to compute the crosscovariance matrix of the unknowns and the observations and t ..."
Cited by 5 (2 self)
and the autocovariance matrix of the observations. For large numbers n of discrete values of the unknown, the storage and computational costs for evaluating these matrices, proportional to n², become strictly inhibiting. In this paper, we summarize and extend a collection of highly efficient spectral methods
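The n² cost mentioned here is the generic obstacle that spectral methods attack: for a stationary covariance on a regular grid, the covariance matrix is Toeplitz and can be applied to a vector in O(n log n) via FFT-based circulant embedding, without ever forming the n × n matrix. A generic sketch of that trick, not the authors' specific algorithm:

```python
import numpy as np

def toeplitz_matvec(c, v):
    """Multiply a symmetric Toeplitz covariance matrix (first column c)
    by a vector v in O(n log n) using circulant embedding and the FFT,
    avoiding the O(n^2) storage of the dense matrix."""
    n = len(c)
    # embed the Toeplitz matrix in a circulant of size 2n - 2
    circ = np.concatenate([c, c[-2:0:-1]])
    ev = np.fft.fft(circ)                      # eigenvalues of the circulant
    vp = np.concatenate([v, np.zeros(n - 2)])  # zero-pad v to match
    return np.real(np.fft.ifft(ev * np.fft.fft(vp))[:n])
```

The first n entries of the circulant product recover the Toeplitz product exactly, which is why the matrix–matrix multiplications in the conditioning step can be replaced by a handful of FFTs.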
Generalized Linear Random Vibration Analysis Using Auto-Covariance Orthogonal Decomposition
"... Application of a stationary Gaussian Random process to describe a nondeterministic forcing function of a linear vibrating system is well studied and documented. Two algorithms: 1) KL Expansion Method, and 2) Collocation Technique for nonstationary and nonGaussian forcing processes (narrow or bro ..."
) and Collocation Technique (discretized covariance matrix) are used to get the eigenvalues and the eigenvectors of the autocovariance function, numerically. The steady-state and the transient response of a single-degree-of-freedom (SDOF) system for an exponential autocovariance (Gaussian random process
Identifying the number of factors from singular values of a large sample autocovariance matrix
"... Abstract: Identifying the number of factors in a highdimensional factor model has attracted much attention in recent years and a general solution to the problem is still lacking. A promising ratio estimator based on the singular values of the lagged autocovariance matrix has been recently proposed ..."
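A ratio estimator of the kind this entry describes can be sketched as follows, assuming the simplest variant: take the ordered singular values of the lag-1 sample autocovariance matrix and report the index at which the ratio of consecutive values drops most sharply. The function below is an illustration of that idea, not the tuned estimator of the paper:

```python
import numpy as np

def num_factors_by_ratio(X, lag=1, max_r=8):
    """Pick the index minimizing s_{i+1}/s_i over the leading singular
    values of the lag-`lag` sample autocovariance matrix of X (T x p).
    Simplest-variant sketch; `max_r` caps the candidate range."""
    X = np.asarray(X, dtype=float)
    T = X.shape[0]
    Xc = X - X.mean(axis=0)
    C = Xc[lag:].T @ Xc[:-lag] / T          # p x p lagged autocovariance
    s = np.linalg.svd(C, compute_uv=False)  # ordered singular values
    ratios = s[1:max_r + 1] / s[:max_r]     # s_{i+1} / s_i
    return int(np.argmin(ratios)) + 1

# toy check: two serially correlated factors plus weak white noise
rng = np.random.default_rng(0)
T, p = 500, 20
f = np.zeros((T, 2))
for t in range(1, T):
    f[t] = 0.8 * f[t - 1] + rng.standard_normal(2)
X = f @ rng.standard_normal((2, p)) + 0.1 * rng.standard_normal((T, p))
r_hat = num_factors_by_ratio(X)
```

Using a lagged (rather than lag-0) autocovariance matrix is what screens out serially uncorrelated idiosyncratic noise, so the singular values past the true number of factors collapse toward zero.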
Long Run Covariance Matrices for Fractionally Integrated Processes
, 2007
"... An asymptotic expansion is given for the autocovariance matrix of a vector of stationary longmemory processes with memory parameters d ∈ [0, 1/2). The theory is then applied to deliver formulae for the long run covariance matrices of multivariate time series with long memory. ..."
Cited by 2 (1 self)
Submitted to the Annals of Applied Probability: ON SINGULAR VALUE DISTRIBUTION OF LARGE-DIMENSIONAL AUTOCOVARIANCE MATRICES
"... Let (εj)j≥0 be a sequence of independent p−dimensional random vectors and τ ≥ 1 a given integer. From a sample ε1, · · · , εT+τ−1, εT+τ of the sequence, the socalled lag−τ autocovariance matrix is Cτ = T−1 ∑T j=1 ετ+jε ..."
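The lag-τ autocovariance matrix in this entry is straightforward to compute; the sketch below follows the displayed formula C_τ = T^{−1} ∑_{j=1}^{T} ε_{τ+j} ε, with the transpose placed on the second factor as our reading of the snippet, which truncates mid-formula:

```python
import numpy as np

def lag_autocovariance(eps, tau):
    """C_tau = T^{-1} * sum_{j=1}^{T} eps_{tau+j} eps_j^T for a
    (T + tau) x p sample.  The transpose on the second factor is an
    assumption, since the source formula is cut off."""
    eps = np.asarray(eps, dtype=float)
    T = eps.shape[0] - tau
    return eps[tau:].T @ eps[:T] / T

rng = np.random.default_rng(1)
C1 = lag_autocovariance(rng.standard_normal((501, 4)), tau=1)
```

For independent vectors, as in the entry's setting, every entry of C_τ has mean zero and O(T^{−1/2}) fluctuations; the paper concerns the limiting singular value distribution of such matrices when p grows with T.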
Correction to “Banded and tapered estimates of autocovariance matrices and the linear process bootstrap”, J. Time Ser. Anal., vol.
, 2011
"... In this note, we correct the proof and statement of Theorem 5 in McMurry and Politis (2010), which establishes the consistency of the linear process bootstrap (LPB) for the sample mean. The statement of Lemma 6, on which Theorem 5 relies, is in error. Lemma 6 is used to bound the operator norm ρ(A 1 ..."
Abstract
 Add to MetaCart
(A^{1/2} − B^{1/2}) by a bounded factor times the operator norm ρ(A − B), where A^{1/2} and B^{1/2} are taken to be the lower triangular Cholesky factors of A and B. The result was erroneously thought to be an extension of results in Horn and Johnson (1990) for the matrix square root given by the spectral
A Simple Nonlinear Filter for Edge Detection in Images
"... We specialize to two simple cases the algorithm for singularity detection in images from eigenvalues of the dual local autocovariance matrix. The eigenvalue difference, or “edginess ” at a point, then reduces to a simple nonlinear function. We discuss the derivation of these functions, which provide ..."
Cited by 1 (0 self)