
dc.contributor.advisor: Pati, Debdeep
dc.contributor.advisor: Bhattacharya, Anirban
dc.creator: Guha, Biraj Subhra
dc.date.accessioned: 2022-01-24T22:19:50Z
dc.date.available: 2022-01-24T22:19:50Z
dc.date.created: 2021-08
dc.date.issued: 2021-07-16
dc.date.submitted: August 2021
dc.identifier.uri: https://hdl.handle.net/1969.1/195138
dc.description.abstract: We provide statistical guarantees for Bayesian variational boosting by proposing a novel small-bandwidth Gaussian mixture variational family. We employ a functional version of Frank-Wolfe optimization as our variational algorithm and study frequentist properties of the iterative boosting updates. Comparisons are drawn to the recent literature on boosting, describing how the choice of the variational family and the discrepancy measure affect both convergence and finite-sample statistical properties of the optimization routine. Specifically, we first demonstrate stochastic boundedness of the boosting iterates with respect to the data-generating distribution. We then integrate this within our algorithm to provide an explicit convergence rate, ending with a result on the required number of boosting updates. Next, we develop a framework to study posterior contraction rates in sparse high-dimensional generalized linear models (GLMs). We introduce a new family of GLMs, denoted clipped GLMs, which subsumes many standard GLMs and makes minor modifications to the rest. With a sparsity-inducing prior on the regression coefficients, we delineate sufficient conditions on the true data-generating density that lead to minimax-optimal rates of posterior contraction of the coefficients in the l_1 norm. Our key contributions are to develop sufficient conditions commensurate with the geometry of the clipped GLM family, propose prior distributions that do not require any knowledge of the true parameters, and avoid any assumption on the growth rate of the true coefficient vector.
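The abstract describes a Frank-Wolfe-style variational boosting scheme: a small-bandwidth Gaussian mixture is grown one component per iteration to approximate a target posterior, with the classical 2/(t+2) step size. The following is a minimal 1-D sketch of that general idea, not the thesis's actual algorithm: the function names (`boost`, `gauss`), the grid-based quadrature, and the greedy search over candidate means are all illustrative assumptions.

```python
import numpy as np

def gauss(x, mu, sigma):
    """Density of N(mu, sigma^2) evaluated pointwise."""
    return np.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))

def boost(log_target, n_iters=15, sigma=0.15,
          cand_means=np.linspace(-4.0, 4.0, 201)):
    """Greedily build q_t = sum_k w_k N(mu_k, sigma^2) to approximate the
    (unnormalized) target density exp(log_target).  Frank-Wolfe style:
    at step t a single new small-bandwidth component is mixed in with
    weight 2/(t+2), chosen to most reduce KL(q || p) on a grid."""
    x = np.linspace(-6.0, 6.0, 2001)    # quadrature grid
    dx = x[1] - x[0]
    p = np.exp(log_target(x))
    p /= p.sum() * dx                   # normalize the target on the grid

    means, weights = [], []

    def q(x_eval):
        """Current mixture density (closure over means/weights)."""
        out = np.zeros_like(x_eval, dtype=float)
        for w, m in zip(weights, means):
            out += w * gauss(x_eval, m, sigma)
        return out

    for t in range(n_iters):
        step = 2.0 / (t + 2.0)          # classical Frank-Wolfe step size
        qx = q(x)
        best_mu, best_kl = cand_means[0], np.inf
        for m in cand_means:            # greedy search over candidate means
            cand = (1.0 - step) * qx + step * gauss(x, m, sigma)
            kl = np.sum(cand * (np.log(cand + 1e-12) - np.log(p + 1e-12))) * dx
            if kl < best_kl:
                best_kl, best_mu = kl, m
        # shrink old weights, append the new component
        weights = [w * (1.0 - step) for w in weights]
        weights.append(step)
        means.append(best_mu)
    return means, weights, q
```

For example, `boost(lambda x: -0.5 * x ** 2)` approximates a standard normal target; the mixture weights always sum to one because each update is a convex combination, mirroring the convex boosting updates the abstract analyzes.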
dc.format.mimetype: application/pdf
dc.language.iso: en
dc.subject: Approximate Bayesian Inference
dc.subject: Variational Boosting
dc.subject: Frank-Wolfe Algorithm
dc.subject: Convergence Rate
dc.subject: Kullback-Leibler Divergence
dc.subject: Gaussian Mixtures
dc.subject: High-dimension
dc.subject: Sparse Regression
dc.subject: Generalized Linear Models
dc.subject: Posterior Convergence
dc.subject: Model Selection
dc.subject: Adaptive Estimation
dc.subject: Spike-and-slab Prior
dc.subject: Minimax Rate
dc.title: Theoretical Guarantees for Bayesian Generalized Linear Regression and Variational Boosting
dc.type: Thesis
thesis.degree.department: Statistics
thesis.degree.discipline: Statistics
thesis.degree.grantor: Texas A&M University
thesis.degree.name: Doctor of Philosophy
thesis.degree.level: Doctoral
dc.contributor.committeeMember: Cline, Daren
dc.contributor.committeeMember: Carroll, Raymond
dc.contributor.committeeMember: Narayanan, Krishna
dc.type.material: text
dc.date.updated: 2022-01-24T22:19:50Z
local.etdauthor.orcid: 0000-0002-1806-936X

