Results 1 - 3 of 3
FITTING STOCHASTIC VOLATILITY MODELS IN THE PRESENCE OF IRREGULAR SAMPLING VIA PARTICLE METHODS AND THE EM ALGORITHM, 2008
Abstract - Cited by 3 (0 self)
Stochastic volatility (SV) models have become increasingly popular for explaining the behaviour of financial variables such as stock prices and exchange rates, and their popularity has resulted in several different proposed approaches to estimating the parameters of the model. An important feature of financial data, which is commonly ignored, is the occurrence of irregular sampling because of holidays or unexpected events. We present a method that can handle the estimation problem of SV models when the sampling is somewhat irregular. The basic idea of our approach is to combine the expectation-maximization (EM) algorithm with particle filters and smoothers in order to estimate parameters of the model. In addition, we expand the scope of application of SV models by adopting a normal mixture, with unknown parameters, for the observational error term rather than assuming a log-chi-squared distribution. We address the problems by using state–space models and imputation. Finally, we present simulation studies and real data analyses to establish the viability of the proposed method.
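As a rough illustration of the machinery the abstract describes (not the authors' exact specification, which adds the EM step, a normal-mixture observation error, and imputation for irregular sampling), the sketch below simulates a standard SV model and runs a bootstrap particle filter to track the latent log-volatility. All parameter values (`phi`, `sigma`, `beta`) are hypothetical choices for the demo.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical parameters for the demo, not estimates from the paper.
phi, sigma, beta = 0.95, 0.3, 0.6   # AR(1) persistence, state noise sd, scale
T, N = 200, 1000                    # time steps, number of particles

# Simulate the SV model:
#   x_t = phi * x_{t-1} + sigma * w_t        (latent log-volatility)
#   y_t = beta * exp(x_t / 2) * eps_t        (observed return)
x = np.zeros(T)
y = np.zeros(T)
x[0] = rng.normal(0, sigma / np.sqrt(1 - phi**2))   # stationary start
y[0] = beta * np.exp(x[0] / 2) * rng.normal()
for t in range(1, T):
    x[t] = phi * x[t - 1] + sigma * rng.normal()
    y[t] = beta * np.exp(x[t] / 2) * rng.normal()

def bootstrap_filter(y, phi, sigma, beta, N, rng):
    """Bootstrap particle filter: returns filtered means of the log-volatility."""
    T = len(y)
    means = np.zeros(T)
    # Initialise particles from the stationary AR(1) distribution.
    particles = rng.normal(0, sigma / np.sqrt(1 - phi**2), size=N)
    for t in range(T):
        if t > 0:
            # Propagate through the state equation.
            particles = phi * particles + sigma * rng.normal(size=N)
        # Weight by the observation density y_t | x_t ~ N(0, beta^2 exp(x_t)).
        sd = beta * np.exp(particles / 2)
        logw = -0.5 * (y[t] / sd) ** 2 - np.log(sd)
        w = np.exp(logw - logw.max())
        w /= w.sum()
        means[t] = np.sum(w * particles)
        # Multinomial resampling to avoid weight degeneracy.
        particles = particles[rng.choice(N, size=N, p=w)]
    return means

est = bootstrap_filter(y, phi, sigma, beta, N, rng)
```

In a full EM implementation, smoothed (not just filtered) particle estimates of the latent state would feed the E-step, and the M-step would update `phi`, `sigma`, `beta` and the mixture parameters; the filter above is only the inner building block.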
A state space modeling approach to mediation analysis
- Journal of Educational and Behavioral Statistics
Abstract - Cited by 2 (0 self)
Mediation is a causal process that evolves over time. Thus, a study of mediation requires data collected throughout the process. However, most applications of mediation analysis use cross-sectional rather than longitudinal data. Another implicit assumption commonly made in longitudinal designs for mediation analysis is that the same mediation process universally applies to all members of the population under investigation. This assumption ignores the important issue of ergodicity before aggregating the data across subjects. We first argue that there exists a discrepancy between the concept of mediation and the research designs that are typically used to investigate it. Second, based on the concept of ergodicity, we argue that a given mediation process probably is not equally valid for all individuals in a population. Therefore, the purpose of this article is to propose a two-faceted solution. The first facet of the solution is that we advocate a single-subject time-series design that aligns data collection with researchers' conceptual understanding of mediation. The second facet is to introduce a flexible statistical method: the state space model.
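To give a concrete sense of the state-space machinery behind a single-subject time-series design (a minimal sketch under assumed parameters, not the authors' mediation model), the code below filters a scalar latent AR(1) process observed with noise using the standard Kalman recursions.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical single-subject series in state-space form:
#   x_t = a * x_{t-1} + w_t,  w_t ~ N(0, q)   (latent state)
#   y_t = x_t + v_t,          v_t ~ N(0, r)   (noisy observation)
a, q, r, T = 0.8, 0.5, 1.0, 300
x = np.zeros(T)
for t in range(1, T):
    x[t] = a * x[t - 1] + rng.normal(0, np.sqrt(q))
y = x + rng.normal(0, np.sqrt(r), size=T)

def kalman_filter(y, a, q, r):
    """Scalar Kalman filter: returns the filtered state means."""
    m, P = 0.0, q / (1 - a**2)          # stationary prior mean and variance
    means = np.zeros(len(y))
    for t, yt in enumerate(y):
        if t > 0:
            m, P = a * m, a**2 * P + q  # predict step
        K = P / (P + r)                 # Kalman gain
        m = m + K * (yt - m)            # update with observation
        P = (1 - K) * P
        means[t] = m
    return means

m = kalman_filter(y, a, q, r)
```

A mediation application would replace the scalar state with a vector containing mediator and outcome dynamics, but the predict/update cycle is the same.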
Using Parametric and Residual-based Bootstrap to Assess the Absolute Goodness-of-fit for State Space Model by
Abstract
ACKNOWLEDGEMENTS I would like to thank Dr. Neal M. Kingston for his supervision throughout the entire process of this doctoral dissertation. I would also like to express my special thanks to Dr. Todd D. Little, the director of the Center for Research Methods and Data Analysis (CRMDA), for providing SAS Unix and the High Performance Computing (HPC) facility, and Dr. Paul Johnson, the