A Markov Chain Monte Carlo Algorithm for a Class of Mixture Time Series Models
by John Lau
Abstract: This talk presents a generalization of a Markov chain Monte Carlo (MCMC) algorithm, developed from the Gibbs weighted Chinese restaurant (gWCR) process algorithm, for a class of kernel mixtures of time series models over the Dirichlet process.
This class of models extends Lo's (1984) kernel mixture model for independent observations. The kernel is a known distribution of the time series conditional on past observations and on both present and past latent variables. The latent variables are independent samples from a Dirichlet process, an almost surely discrete random distribution. The class includes infinite mixtures of autoregressive processes and infinite mixtures of generalized autoregressive conditional heteroskedasticity (GARCH) processes.
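The structure above can be sketched in a few lines: draw an (almost surely discrete) random distribution from a Dirichlet process, sample i.i.d. latent variables from it, and let a time-series kernel depend on the current latent variable and the past observation. This is a minimal illustration only; the truncated stick-breaking construction, the Uniform(-0.9, 0.9) base measure, and the AR(1) Gaussian kernel are assumptions chosen for the sketch, not the specification used in the talk.

```python
import numpy as np

rng = np.random.default_rng(0)

def dp_stick_breaking(alpha, base_draw, truncation=50):
    """Truncated stick-breaking draw from a Dirichlet process.

    Returns weights and atoms of a discrete distribution; in the
    untruncated limit this is a draw from DP(alpha, base measure).
    """
    betas = rng.beta(1.0, alpha, size=truncation)
    weights = betas * np.cumprod(np.concatenate(([1.0], 1.0 - betas[:-1])))
    atoms = base_draw(truncation)
    return weights, atoms

# Latent variables: i.i.d. draws from the realized DP.
# Each atom is an illustrative AR(1) coefficient from a
# Uniform(-0.9, 0.9) base measure (an assumption for this sketch).
weights, atoms = dp_stick_breaking(
    alpha=1.0, base_draw=lambda k: rng.uniform(-0.9, 0.9, k))

n = 200
z = rng.choice(len(atoms), size=n, p=weights / weights.sum())
phi = atoms[z]  # latent AR coefficients, one per time point; ties occur
                # because the DP draw is discrete

# Kernel: y_t | y_{t-1}, phi_t ~ N(phi_t * y_{t-1}, 1),
# i.e. an infinite mixture of AR(1) processes.
y = np.zeros(n)
for t in range(1, n):
    y[t] = phi[t] * y[t - 1] + rng.normal()
```

The discreteness of the Dirichlet process draw is what makes this a mixture: distinct time points share the same latent coefficient with positive probability.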
Evaluating estimates under such models involves sampling partitions of n integers and sampling the unique values of the latent variables. Existing algorithms for kernel mixture models do not apply, because the likelihood of the time series models induces dependencies among the latent variables. We generalize these algorithms to mixture time series models using the reseating idea of the gWCR process, which originates from the Pólya urn sampling scheme. The methodology is illustrated by volatility estimation for ten financial indices fitted with an infinite mixture of GARCH models. An extension to more general random probability measures, such as two-parameter Poisson-Dirichlet processes and normalized generalized Gamma processes, is also discussed.
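The Pólya-urn reseating idea can be illustrated schematically: each observation is removed from its cluster and reassigned to an existing cluster with weight proportional to (cluster size) x (kernel likelihood), or to a new cluster with weight proportional to the concentration parameter alpha. The Gaussian kernel, standard-normal base measure, and `reseat_once` helper below are assumptions for illustration; the gWCR algorithm of the talk must additionally handle the dependence of the time-series likelihood on past observations, which is omitted here.

```python
import numpy as np

rng = np.random.default_rng(1)

def reseat_once(y, c, atoms, alpha, kernel_loglik, new_atom):
    """One Polya-urn-style reseating sweep over all observations.

    y: observations; c: cluster labels; atoms: {label: parameter}.
    kernel_loglik(y_i, theta) and new_atom() stand in for the model's
    kernel and base measure (illustrative assumptions).
    """
    n = len(y)
    for i in range(n):
        old = c[i]
        c[i] = -1                      # remove observation i from its table
        if old not in c:
            atoms.pop(old, None)       # its table became empty
        labels = sorted(atoms)
        logw = [np.log(np.sum(np.array(c) == l)) + kernel_loglik(y[i], atoms[l])
                for l in labels]       # existing tables: size * likelihood
        fresh = new_atom()
        logw.append(np.log(alpha) + kernel_loglik(y[i], fresh))  # new table
        w = np.exp(np.array(logw) - max(logw))
        k = rng.choice(len(w), p=w / w.sum())
        if k == len(labels):           # open a new table with a fresh atom
            new_label = max(atoms, default=-1) + 1
            atoms[new_label] = fresh
            c[i] = new_label
        else:
            c[i] = labels[k]
    return c, atoms

# Toy run: Gaussian kernel, standard-normal base measure (assumptions).
y = np.concatenate([rng.normal(-3, 1, 30), rng.normal(3, 1, 30)])
c, atoms = [0] * len(y), {0: 0.0}
loglik = lambda yi, th: -0.5 * (yi - th) ** 2
for _ in range(5):
    c, atoms = reseat_once(y, list(c), atoms, alpha=1.0,
                           kernel_loglik=loglik, new_atom=lambda: rng.normal())
```

For an independent-observation kernel this reduces to a standard Gibbs sampler over partitions; the contribution described in the abstract is making this reseating move valid when the kernel likelihood depends on past values of the series.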
For More Information: Contact Owen Jones, email: firstname.lastname@example.org