Dirichlet Process Mixture Models For Markov Processes
dc.contributor.advisor | Subhashis Ghosal, Committee Chair | en_US |
dc.contributor.advisor | Anastasios Tsiatis, Committee Member | en_US |
dc.contributor.advisor | Peter Bloomfield, Committee Member | en_US |
dc.contributor.advisor | Bibhuti B. Bhattacharyya, Committee Member | en_US |
dc.contributor.author | Tang, Yongqiang | en_US |
dc.date.accessioned | 2010-04-02T18:48:46Z | |
dc.date.available | 2010-04-02T18:48:46Z | |
dc.date.issued | 2003-12-03 | en_US |
dc.degree.discipline | Statistics | en_US |
dc.degree.level | dissertation | en_US |
dc.degree.name | PhD | en_US |
dc.description.abstract | Prediction of future observations is an important practical issue for statisticians. When the data can be viewed as exchangeable, de Finetti's theorem implies that, conditionally, the data can be modeled as independent and identically distributed (i.i.d.). The predictive distribution of the future observations given the present data is then given by the posterior expectation of the underlying density function given the observations. The Dirichlet process mixture of normal densities has been successfully used as a prior in the Bayesian density estimation problem. However, when the data arise over time, exchangeability, and therefore the conditional i.i.d. structure of the data, is questionable. A conditional Markov model is more general, yet has structure rich enough to handle such data. The predictive density of the future observation is then given by the posterior expectation of the transition density given the observations. We propose a Dirichlet process mixture prior for the problem of Bayesian estimation of the transition density. An appropriate Markov chain Monte Carlo (MCMC) algorithm for computing the posterior expectation is discussed. Because of an inherent non-conjugacy in the model, the usual Gibbs sampling procedure used for the density estimation problem is hard to implement. We use the recently developed "no-gaps algorithm" to overcome this difficulty. When the Markov model holds, we show the consistency of the Bayes procedures in appropriate topologies by constructing uniformly exponentially consistent tests and extending the idea of Schwartz (1965) to Markov processes. Numerical examples show excellent agreement between the asymptotic theory and the finite sample behavior of the posterior distribution. | en_US |
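For intuition about the prior used in the abstract above, a Dirichlet process mixture of normals can be sampled via Sethuraman's stick-breaking construction. The sketch below is illustrative only and not taken from the dissertation; all parameter names (`alpha`, `base_sd`, `kernel_sd`, `truncation`) and the truncation level are assumptions of this example.

```python
import numpy as np

rng = np.random.default_rng(0)

def sample_dp_mixture(n, alpha=1.0, base_mean=0.0, base_sd=3.0,
                      kernel_sd=1.0, truncation=50):
    """Draw n observations from a truncated Dirichlet process
    mixture of normals via stick-breaking (Sethuraman, 1994)."""
    # Stick-breaking proportions: v_k ~ Beta(1, alpha)
    v = rng.beta(1.0, alpha, size=truncation)
    # Mixture weights: w_k = v_k * prod_{j<k} (1 - v_j)
    w = v * np.concatenate(([1.0], np.cumprod(1.0 - v)[:-1]))
    w /= w.sum()  # renormalize the truncated weights
    # Component means drawn from the base measure G0 = N(base_mean, base_sd^2)
    atoms = rng.normal(base_mean, base_sd, size=truncation)
    # For each observation: pick a component, then add normal kernel noise
    comps = rng.choice(truncation, size=n, p=w)
    return atoms[comps] + rng.normal(0.0, kernel_sd, size=n)

x = sample_dp_mixture(500)
```

With a small `alpha`, most weight concentrates on a few atoms, so draws cluster around a handful of component means; this is the mechanism that makes the DP mixture a flexible prior for an unknown density.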
dc.identifier.other | etd-12032003-102009 | en_US |
dc.identifier.uri | http://www.lib.ncsu.edu/resolver/1840.16/4196 | |
dc.rights | I hereby certify that, if appropriate, I have obtained and attached hereto a written permission statement from the owner(s) of each third party copyrighted matter to be included in my thesis, dissertation, or project report, allowing distribution as specified below. I certify that the version I submitted is the same as that approved by my advisory committee. I hereby grant to NC State University or its agents the non-exclusive license to archive and make accessible, under the conditions specified below, my thesis, dissertation, or project report in whole or in part in all forms of media, now or hereafter known. I retain all other ownership rights to the copyright of the thesis, dissertation or project report. I also retain the right to use in future works (such as articles or books) all or part of this thesis, dissertation, or project report. | en_US |
dc.subject | posterior consistency | en_US |
dc.subject | time series | en_US |
dc.subject | Poisson equation | en_US |
dc.subject | no-gaps algorithm | en_US |
dc.subject | Markov process | en_US |
dc.subject | Dirichlet process | en_US |
dc.subject | uniformly exponentially consistent tests | en_US |
dc.title | Dirichlet Process Mixture Models For Markov Processes | en_US |