|dc.description.abstract||Markov chain Monte Carlo (MCMC) sampling methods often do not scale well to large datasets, so there has been increased interest in approximate Markov chain Monte Carlo (aMCMC) sampling methods. We propose two aMCMC methods. For the first, we introduce a new distribution, called the soft tMVN distribution, which provides a smooth approximation to the truncated multivariate normal (tMVN) distribution with linear constraints. The soft tMVN distribution can be used to approximate simulation from a tMVN distribution with linear constraints, or can itself serve as a prior in shape-constrained problems. We provide theoretical support for the approximation capability of the soft tMVN distribution, along with further empirical evidence. We then develop an aMCMC method for Bayesian monotone single-index modeling: we replace the usual tMVN prior with the soft tMVN prior and show that the soft tMVN prior gives similar statistical performance while running significantly faster.
The second aMCMC method addresses multivariate convex regression. Convex regression methods that use the maximum of affine functions perform well in traditional frequentist settings but do not scale well to large data in Bayesian settings. We propose the softmax-affine convex (SMA) regression method, which replaces the maximum of affine functions with the softmax function, a smooth approximation to that maximum. Smoothness allows gradients to be computed, which makes the Hamiltonian Monte Carlo (HMC) algorithm a natural choice for sampling from the posterior. We specify priors for SMA and use Stan, a default HMC implementation, to sample from the posterior. We provide empirical evidence that SMA regression is comparable to existing convex regression methods, and we give a method for choosing the number of affine functions in the softmax.||en
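The softmax (log-sum-exp) smoothing of a maximum of affine functions described in the abstract can be sketched as follows. This is a minimal illustration, not the thesis's implementation: the function name `softmax_affine` and the sharpness parameter `beta` are assumptions introduced here for clarity.

```python
import numpy as np

def softmax_affine(x, A, b, beta=50.0):
    """Smooth log-sum-exp approximation to max_k(A[k] @ x + b[k]).

    beta controls sharpness: as beta -> infinity this converges to the
    exact maximum of the affine pieces, while remaining differentiable
    in x (which is what makes gradient-based samplers like HMC usable).
    """
    z = A @ x + b                      # value of each affine piece at x
    y = beta * z
    m = np.max(y)                      # shift for numerical stability
    return (m + np.log(np.sum(np.exp(y - m)))) / beta

# Example: two affine pieces f1(x) = x and f2(x) = -x + 1;
# their pointwise maximum is a convex, piecewise-linear function.
A = np.array([[1.0], [-1.0]])
b = np.array([0.0, 1.0])
x = np.array([2.0])
exact = np.max(A @ x + b)              # hard (non-smooth) max
smooth = softmax_affine(x, A, b)       # close to exact, but differentiable
```

The same log-sum-exp form is what allows an HMC implementation such as Stan to differentiate the regression function with respect to its parameters.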