
dc.contributor.advisor: Bhattacharya, Anirban
dc.creator: Larsen, Allyson Elaine
dc.date.accessioned: 2021-02-19T19:36:25Z
dc.date.available: 2021-02-19T19:36:25Z
dc.date.created: 2020-08
dc.date.issued: 2020-05-21
dc.date.submitted: August 2020
dc.identifier.uri: https://hdl.handle.net/1969.1/192454
dc.description.abstract: Markov chain Monte Carlo (MCMC) sampling methods often do not scale well to large datasets, so there has been increased interest in approximate Markov chain Monte Carlo (aMCMC) sampling methods. We propose two aMCMC methods. For the first method, we propose a new distribution, called the soft tMVN distribution, which provides a smooth approximation to the truncated multivariate normal (tMVN) distribution with linear constraints. The soft tMVN distribution can be used to approximate simulation from a tMVN distribution with linear constraints, or can itself serve as a prior in shape-constrained problems. We provide theoretical support for the approximation capability of the soft tMVN distribution, along with further empirical evidence. We then develop an aMCMC method for Bayesian monotone single-index modeling: we replace the usual tMVN prior with the soft tMVN prior and show that the soft tMVN prior yields similar statistical performance with a significantly faster run-time. The second aMCMC method addresses multivariate convex regression, where we approximate the max of affine functions with the softmax of affine functions. Convex regression methods based on the max of affine functions perform well in traditional frequentist settings but do not scale well to large data in Bayesian settings. We propose the softmax-affine convex (SMA) regression method, which replaces the max with the softmax, a smooth function that approximates the max of affine functions. This allows gradients to be computed, which makes the Hamiltonian Monte Carlo (HMC) algorithm a natural choice for sampling from the posterior. We specify the priors for SMA and use Stan, which provides a default HMC implementation, to sample from the posterior. We provide empirical evidence that SMA regression is comparable to existing convex regression methods. We also provide a method for choosing the number of affine functions in the softmax function. (An illustrative sketch of the softmax approximation appears after this record.)
dc.format.mimetype: application/pdf
dc.language.iso: en
dc.subject: Approximate
dc.subject: Markov chain Monte Carlo
dc.title: Approximation Schemes to Simplify Posterior Computation
dc.type: Thesis
thesis.degree.department: Statistics
thesis.degree.discipline: Statistics
thesis.degree.grantor: Texas A&M University
thesis.degree.name: Doctor of Philosophy
thesis.degree.level: Doctoral
dc.contributor.committeeMember: Gaynanova, Irina
dc.contributor.committeeMember: Mallick, Bani
dc.contributor.committeeMember: Qian, Xiaoning
dc.type.material: text
dc.date.updated: 2021-02-19T19:36:25Z
local.etdauthor.orcid: 0000-0002-2248-0581
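
The abstract above describes replacing the max of affine functions with a smooth softmax so that gradients exist and HMC can be used. The record does not give the thesis's exact parameterization; the following is only an illustrative sketch using the common log-sum-exp (soft maximum) form, where the affine pieces a_k^T x + b_k and the sharpness parameter \beta are assumptions for exposition:

% Illustrative sketch only: the log-sum-exp form and the sharpness
% parameter \beta are assumed here, not taken from the record above.
\[
  f_{\mathrm{SMA}}(x)
    = \frac{1}{\beta}\log\sum_{k=1}^{K}\exp\bigl\{\beta\,(a_k^{\top}x + b_k)\bigr\},
  \qquad \beta > 0,
\]
\[
  \max_{1\le k\le K}\bigl(a_k^{\top}x + b_k\bigr)
    \;\le\; f_{\mathrm{SMA}}(x)
    \;\le\; \max_{1\le k\le K}\bigl(a_k^{\top}x + b_k\bigr) + \frac{\log K}{\beta}.
\]

Under this form, f_SMA is smooth and convex in x and converges to the max of the K affine pieces as \beta grows, which is what makes gradient-based samplers such as HMC (e.g., in Stan) applicable, as the abstract notes. The soft tMVN distribution described in the abstract follows an analogous smoothing idea, replacing the hard indicator of the linear constraints with a steep smooth surrogate; its precise form is not given in this record.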

