Results 1 - 3 of 3
On Monte Carlo methods for Bayesian multivariate regression models with heavy-tailed errors
- Journal of Multivariate Analysis
"... We consider Bayesian analysis of data from multivariate linear regression models whose errors have a distribution that is a scale mixture of normals. Such models are used to analyze data on financial returns, which are notoriously heavy-tailed. Let pi denote the intractable posterior density that re ..."
Abstract
-
Cited by 7 (3 self)
- Add to MetaCart
We consider Bayesian analysis of data from multivariate linear regression models whose errors have a distribution that is a scale mixture of normals. Such models are used to analyze data on financial returns, which are notoriously heavy-tailed. Let π denote the intractable posterior density that results when this regression model is combined with the standard non-informative prior on the unknown regression coefficients and scale matrix of the errors. Roughly speaking, the posterior is proper if and only if n ≥ d + k, where n is the sample size, d is the dimension of the response, and k is the number of covariates. We provide a method of making exact draws from π in the special case where n = d + k, and we study Markov chain Monte Carlo (MCMC) algorithms that can be used to explore π when n > d + k. In particular, we show how the Haar PX-DA technology studied in Hobert and Marchev (2008) can be used to improve upon Liu’s (1996) data augmentation (DA) algorithm. Indeed, the new algorithm that we introduce is theoretically superior to the DA algorithm, yet equivalent to DA in terms of computational complexity. Moreover, we analyze the convergence rates of these MCMC algorithms in the important special case where the regression errors have a Student’s t distribution. We prove that, under conditions on n, d, k, and the degrees of freedom of the t distribution, both algorithms converge at a geometric rate. These convergence rate results are important from a practical standpoint because geometric ergodicity guarantees the existence of central limit theorems, which are essential for the calculation of valid asymptotic standard errors for MCMC-based estimates.
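The abstract describes the algorithms only at a high level. As a concrete illustration, here is a minimal sketch of a DA-style Gibbs sampler for the Student’s t special case, assuming the standard scale-mixture representation (y_i | B, Σ, w_i ~ N_d(B′x_i, Σ/w_i) with w_i ~ Gamma(ν/2, ν/2)) and the non-informative prior π(B, Σ) ∝ |Σ|^(−(d+1)/2) referred to above. The function name and all implementation details are illustrative; this is not the authors’ code.

```python
import numpy as np
from scipy.stats import invwishart

def da_student_t_regression(Y, X, nu, n_iter=5000, seed=None):
    """Sketch of a data augmentation (DA) Gibbs sampler for multivariate
    regression with t_nu errors under the prior |Sigma|^{-(d+1)/2}.
    Illustrative only; follows the standard scale-mixture construction."""
    rng = np.random.default_rng(seed)
    n, d = Y.shape
    k = X.shape[1]
    # The abstract studies MCMC in the regime n > d + k.
    assert n > d + k, "need n > d + k"

    B = np.linalg.lstsq(X, Y, rcond=None)[0]   # start at the OLS fit
    Sigma = (Y - X @ B).T @ (Y - X @ B) / n
    draws = []

    for _ in range(n_iter):
        # I-step: latent weights w_i | B, Sigma, data.
        # w_i ~ Gamma((nu + d)/2, rate = (nu + q_i)/2), where
        # q_i = r_i' Sigma^{-1} r_i is the Mahalanobis residual.
        R = Y - X @ B
        q = np.einsum('ij,jk,ik->i', R, np.linalg.inv(Sigma), R)
        w = rng.gamma(shape=(nu + d) / 2.0, scale=2.0 / (nu + q))

        # P-step: (B, Sigma) | w, data via the weighted conjugate update.
        XtW = X.T * w                          # k x n
        XtWX_inv = np.linalg.inv(XtW @ X)
        B_hat = XtWX_inv @ (XtW @ Y)           # weighted least squares
        Rw = Y - X @ B_hat
        S = (Rw.T * w) @ Rw                    # weighted residual scatter
        Sigma = invwishart.rvs(df=n - k, scale=S, random_state=rng)

        # vec(B) | Sigma, w ~ N(vec(B_hat), Sigma kron (X'WX)^{-1}),
        # sampled as B_hat + Lx Z Ls' with Cholesky factors Lx, Ls.
        Lx = np.linalg.cholesky(XtWX_inv)
        Ls = np.linalg.cholesky(Sigma)
        B = B_hat + Lx @ rng.standard_normal((k, d)) @ Ls.T

        draws.append((B.copy(), Sigma.copy()))
    return draws
```

The Haar PX-DA variant discussed in the abstract inserts an extra low-dimensional move between these two steps; that move is cheap relative to the P-step, which is why the two algorithms have the same computational complexity per iteration.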
Geometric convergence of the Haar PX-DA algorithm for the Bayesian multivariate regression model with Student t errors
, 2009
"... We consider Bayesian analysis of data from multivariate linear regression models whose errors have a distribution that is a scale mixture of normals. Such models are used to analyze data on financial returns, which are notoriously heavy-tailed. Let π denote the intractable posterior density that res ..."
Abstract
- Add to MetaCart
We consider Bayesian analysis of data from multivariate linear regression models whose errors have a distribution that is a scale mixture of normals. Such models are used to analyze data on financial returns, which are notoriously heavy-tailed. Let π denote the intractable posterior density that results when this regression model is combined with the standard non-informative prior on the unknown regression coefficients and scale matrix of the errors. Roughly speaking, the posterior is proper if and only if n ≥ d + k, where n is the sample size, d is the dimension of the response, and k is the number of covariates. We provide a method of making exact draws from π in the special case where n = d + k, and we study Markov chain Monte Carlo (MCMC) algorithms that can be used to explore π when n > d + k. In particular, we show how the Haar PX-DA technology of Hobert and Marchev (2008) can be used to improve upon Liu’s (1996) data augmentation (DA) algorithm. Indeed, the new algorithm that we introduce is theoretically superior to the DA algorithm, yet equivalent to DA in terms of computational complexity. Moreover, we analyze the convergence rates of these MCMC algorithms in the important special case where the regression errors have a Student’s t distribution. We prove that, under conditions on n, d, k, and the degrees of freedom of the t distribution, both algorithms converge at a geometric rate. These convergence rate results are important from a practical standpoint because geometric ergodicity guarantees the existence of central limit theorems, which are essential for the calculation of valid asymptotic standard errors for MCMC-based estimates.
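The practical payoff noted in the final sentence is that a Markov chain central limit theorem licenses Monte Carlo standard errors computed from a single chain. The batch-means estimator sketched below is one standard way to do this; it is a generic illustration under that CLT assumption, not code from the paper.

```python
import numpy as np

def batch_means_se(chain, n_batches=30):
    """Batch-means estimate of the Monte Carlo standard error of an
    ergodic average; valid when the chain satisfies a CLT, as under
    the geometric ergodicity established in the paper. Generic sketch."""
    chain = np.asarray(chain, dtype=float)
    m = len(chain) // n_batches            # observations per batch
    trimmed = chain[: m * n_batches]
    means = trimmed.reshape(n_batches, m).mean(axis=1)
    # m * Var(batch means) estimates sigma^2 in the CLT
    # sqrt(n) * (mean(chain) - E_pi[g]) -> N(0, sigma^2).
    sigma2_hat = m * means.var(ddof=1)
    return np.sqrt(sigma2_hat / len(trimmed))

# Example: standard error for the posterior mean of one regression
# coefficient, given draws from a sampler like the sketch above.
# se = batch_means_se([B[0, 0] for B, Sigma in draws])
```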
To my parents
, 2008
"... ACKNOWLEDGMENTS I extend my sincerest thanks to my advisor Jim Hobert for his guidance throughout my graduate study at University of Florida. I feel fortunate to have Jim as my PhD advisor. His guidance, help, enthusiasm, were all crucial to making this thesis take its current shape. I’m deeply grat ..."
Abstract
- Add to MetaCart
ACKNOWLEDGMENTS I extend my sincerest thanks to my advisor Jim Hobert for his guidance throughout my graduate study at the University of Florida. I feel fortunate to have Jim as my PhD advisor. His guidance, help, and enthusiasm were all crucial to making this thesis take its current shape. I am deeply grateful to him for many other things, not least for his inspiring words in my hours of need. I would also like to thank Professors Ben Bolker, Hani Doss and Brett Presnell for agreeing to serve on my committee. I am particularly grateful to Professors Hani Doss and Brett Presnell for being so kind to me over the past five years. I learned not only statistics but also a lot about Emacs, LaTeX and R from them. I would also like to thank Professors Bob Dorazio, Malay Ghosh and Andrew Rosalsky for sparing a lot of their valuable time for academic discussions with me and giving me advice on several issues. I thank all my teachers from school, college and the Indian Statistical Institute, whose dedication to teaching and quest for knowledge have inspired me to pursue higher study. Special thanks go to Ananyadi and Parag for their friendship, care and support. I have learned a lot about life in the past five years from both of them. I owe deep gratitude to Jethima, whose care and affection I will never forget. I am thankful to Shuva, whose love and enthusiasm for mathematics have always inspired me. I am indebted to many other people, mostly from my village, who guided me and encouraged me during the formative years of my life: Arunda, Arunkaku, Bapida, Bomkeshjethu, Budhujethu, Shashankajethu and my uncle. Finally, I would like to thank my parents for always being a driving force in my life. I often feel that whatever I have achieved is due only to my parents’ sacrifice, hard work and honesty.