Show simple item record

dc.contributor.advisor    Liang, Faming
dc.creator    Jin, Ick Hoon
dc.date.accessioned    2013-12-16T19:55:02Z
dc.date.available    2013-12-16T19:55:02Z
dc.date.created    2011-08
dc.date.issued    2011-06-27
dc.date.submitted    August 2011
dc.identifier.uri    https://hdl.handle.net/1969.1/150938
dc.description.abstract    In this dissertation, we propose two new algorithms for statistical inference in models with intractable normalizing constants: the Monte Carlo Metropolis-Hastings (MCMH) algorithm and the Bayesian stochastic approximation Monte Carlo (BSAMC) algorithm. The MCMH algorithm is a Monte Carlo version of the Metropolis-Hastings algorithm: at each iteration, it replaces the unknown normalizing constant ratio by a Monte Carlo estimate. Although the algorithm violates the detailed balance condition, it still converges, as shown in the dissertation, to the desired target distribution under mild conditions. The BSAMC algorithm works by simulating from a sequence of approximated distributions using the SAMC algorithm. A strong law of large numbers has been established for BSAMC estimators under mild conditions. One significant advantage of our algorithms over auxiliary-variable MCMC methods is that they avoid the requirement for perfect samples, and thus they can be applied to many models for which perfect sampling is unavailable or very expensive. In addition, although a normalizing constant approximation is also involved in BSAMC, BSAMC performs robustly with respect to initial parameter guesses owing to SAMC's powerful sample-space exploration. BSAMC also provides a general framework for approximate Bayesian inference in models with intractable likelihood functions: sampling from a sequence of approximated distributions whose average converges to the target distribution. Building on these two algorithms, we demonstrate how the SAMCMC method can be applied to estimate the parameters of exponential random graph models (ERGMs), a typical example of a statistical model with an intractable normalizing constant. We show that the resulting estimator is consistent, asymptotically normal, and asymptotically efficient.
Compared to the MCMLE and SSA methods, a significant advantage of SAMCMC is that it overcomes the model degeneracy problem. The strength of SAMCMC comes from its varying-truncation mechanism, which enables SAMCMC to avoid model degeneracy through re-initialization. MCMLE and SSA lack this re-initialization mechanism and tend to converge to a solution near the starting point, so they often fail for models that suffer from model degeneracy.    en
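The MCMH idea described in the abstract — replacing the intractable normalizing constant ratio in the Metropolis-Hastings acceptance probability with an importance-sampling estimate — can be sketched for a toy one-parameter exponential-family model f(x; θ) ∝ exp(θx) on {0, …, K}. This is a minimal illustrative sketch, not code from the dissertation: the function names are invented, and the exact inner sampler (enumeration here) stands in for the MCMC sampler the algorithm would use in a real intractable model.

```python
import math
import random

def sample_from_model(theta, K, m, rng):
    # Draw m samples from pi(x | theta) ∝ exp(theta * x) on {0, ..., K}.
    # Exact sampling by enumeration; in a real application (e.g. an Ising
    # model) this would be an inner MCMC sampler.
    weights = [math.exp(theta * x) for x in range(K + 1)]
    return rng.choices(range(K + 1), weights=weights, k=m)

def mcmh(data, K=10, n_iter=5000, step=0.1, m=200, seed=1):
    # Monte Carlo Metropolis-Hastings sketch with a N(0, 1) prior on theta.
    rng = random.Random(seed)
    n, s = len(data), sum(data)      # sample size and sufficient statistic
    theta, chain = 0.0, []
    for _ in range(n_iter):
        theta_new = theta + rng.gauss(0.0, step)
        # Monte Carlo estimate of the ratio Z(theta_new) / Z(theta):
        # E_{y ~ pi(.|theta)}[ f(y; theta_new) / f(y; theta) ]
        ys = sample_from_model(theta, K, m, rng)
        r_hat = sum(math.exp((theta_new - theta) * y) for y in ys) / m
        # Log acceptance ratio with the unknown ratio replaced by r_hat.
        log_alpha = ((theta_new - theta) * s
                     - n * math.log(r_hat)
                     + 0.5 * (theta ** 2 - theta_new ** 2))
        if math.log(rng.random()) < log_alpha:
            theta = theta_new
        chain.append(theta)
    return chain
```

For data simulated with θ = 0.5, the post-burn-in chain average lands close to the true parameter, even though Z(θ) is never computed exactly inside the sampler — the estimated ratio r̂ introduces noise (and a violation of detailed balance), but the chain still targets the posterior, in line with the convergence result stated in the abstract.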
dc.format.mimetype    application/pdf
dc.subject    Autologistic Model    en
dc.subject    Ising Model    en
dc.subject    Stochastic Approximation Monte Carlo    en
dc.subject    Exponential Random Graph Models    en
dc.subject    Markov Chain Monte Carlo    en
dc.subject    Intractable Normalizing Constants    en
dc.title    Statistical Inference for Models with Intractable Normalizing Constants    en
dc.type    Thesis    en
thesis.degree.department    Statistics    en
thesis.degree.discipline    Statistics    en
thesis.degree.grantor    Texas A&M University    en
thesis.degree.name    Doctor of Philosophy    en
thesis.degree.level    Doctoral    en
dc.contributor.committeeMember    Dahl, David B
dc.contributor.committeeMember    Sinha, Samiran
dc.contributor.committeeMember    Yoon, Byung-jun
dc.type.material    text    en
dc.date.updated    2013-12-16T19:55:02Z

