Dirichlet Process Mixture Models For Markov Processes

dc.contributor.advisor: Subhashis Ghosal, Committee Chair
dc.contributor.advisor: Anastasios Tsiatis, Committee Member
dc.contributor.advisor: Peter Bloomfield, Committee Member
dc.contributor.advisor: Bibhuti B. Bhattacharyya, Committee Member
dc.contributor.author: Tang, Yongqiang
dc.date.accessioned: 2010-04-02T18:48:46Z
dc.date.available: 2010-04-02T18:48:46Z
dc.date.issued: 2003-12-03
dc.degree.discipline: Statistics
dc.degree.level: dissertation
dc.degree.name: PhD
dc.description.abstract: Prediction of future observations is an important practical issue for statisticians. When the data can be viewed as exchangeable, de Finetti's theorem implies that, conditionally, the data can be modeled as independent and identically distributed (i.i.d.). The predictive distribution of a future observation given the present data is then the posterior expectation of the underlying density function given the observations. The Dirichlet process mixture of normal densities has been used successfully as a prior in the Bayesian density estimation problem. However, when the data arise over time, exchangeability, and therefore the conditional i.i.d. structure of the data, is questionable. A conditional Markov model is more general, yet has a structure rich enough to handle such data. The predictive density of a future observation is then the posterior expectation of the transition density given the observations. We propose a Dirichlet process mixture prior for the problem of Bayesian estimation of a transition density. An appropriate Markov chain Monte Carlo (MCMC) algorithm for computing the posterior expectation is discussed. Because of an inherent non-conjugacy in the model, the usual Gibbs sampling procedure used for the density estimation problem is hard to implement; we propose using the recently proposed "no-gaps algorithm" to overcome this difficulty. When the Markov model holds, we show the consistency of the Bayes procedures in appropriate topologies by constructing uniformly exponentially consistent tests and extending the idea of Schwartz (1965) to Markov processes. Numerical examples show excellent agreement between the asymptotic theory and the finite-sample behavior of the posterior distribution.
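For orientation only, here is a minimal sketch of one standard way to place a Dirichlet process mixture prior on a transition density; it is an illustrative construction, not necessarily the exact formulation developed in the dissertation. The joint law of a consecutive pair (X_i, X_{i+1}) is modeled as a DP mixture of bivariate normals, and the transition density is the implied conditional:

\[
f_P(y \mid x) \;=\; \frac{\int \phi_2\big((x, y);\, \mu, \Sigma\big)\, dP(\mu, \Sigma)}{\int \phi_1\big(x;\, \mu_1, \Sigma_{11}\big)\, dP(\mu, \Sigma)},
\qquad P \sim \mathrm{DP}(M, G_0),
\]

where \(\phi_2\) and \(\phi_1\) denote the bivariate and univariate normal densities, \(\mu_1\) and \(\Sigma_{11}\) are the corresponding marginal mean and variance, and the precision \(M\) and base measure \(G_0\) are generic hyperparameters introduced only for this sketch. In this setup, the posterior expectation of \(f_P(y \mid x)\) given the observed chain plays the role of the predictive density of the next observation described in the abstract.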
dc.identifier.other: etd-12032003-102009
dc.identifier.uri: http://www.lib.ncsu.edu/resolver/1840.16/4196
dc.rights: I hereby certify that, if appropriate, I have obtained and attached hereto a written permission statement from the owner(s) of each third party copyrighted matter to be included in my thesis, dissertation, or project report, allowing distribution as specified below. I certify that the version I submitted is the same as that approved by my advisory committee. I hereby grant to NC State University or its agents the non-exclusive license to archive and make accessible, under the conditions specified below, my thesis, dissertation, or project report in whole or in part in all forms of media, now or hereafter known. I retain all other ownership rights to the copyright of the thesis, dissertation or project report. I also retain the right to use in future works (such as articles or books) all or part of this thesis, dissertation, or project report.
dc.subject: posterior consistency
dc.subject: time series
dc.subject: Poisson equation
dc.subject: no-gaps algorithm
dc.subject: Markov process
dc.subject: Dirichlet process
dc.subject: uniformly exponentially consistent tests
dc.title: Dirichlet Process Mixture Models For Markov Processes

Files

Original bundle

Name: etd.pdf
Size: 1.35 MB
Format: Adobe Portable Document Format
