AI News Hub

Fast Mixing of Data Augmentation Algorithms: Bayesian Probit, Logit, and Lasso Regression

stat.ML updates on arXiv.org
Holden Lee, Kexin Zhang

arXiv:2412.07999v3 Announce Type: replace-cross Abstract: We propose using a modified conductance-based method to study the mixing time of an important class of two-block Gibbs samplers, the data augmentation (DA) algorithm. Using this method, we prove the first non-asymptotic polynomial upper bounds on the mixing times of three important DA algorithms: those for Bayesian Probit regression (Albert and Chib, 1993; ProbitDA), Bayesian Logit regression (Polson, Scott, and Windle, 2013; LogitDA), and Bayesian Lasso regression (Park and Casella, 2008; Rajaratnam et al., 2015; LassoDA). Concretely, for ProbitDA and LogitDA, we demonstrate a tight bound that explicitly depends on the design matrix and the prior covariance matrix. Under the assumption that the data are independently generated from either a sub-Gaussian or log-concave distribution and properly scaled, the bound implies that with an $\eta$-warm start, parameter dimension $d$, and sample size $n$, the two algorithms require, with high probability over the data, $\mathcal{O}\left(n\log \left(\frac{\log \eta}{\epsilon}\right)\right)$ steps to obtain samples with at most $\epsilon$ error in TV, KL, or $\chi^2$ distance. Meanwhile, we show that under minimal data assumptions, LassoDA requires $\mathcal{O}\left(d^2(d\log d +n \log n)^2 \log \left(\frac{\eta}{\epsilon}\right)\right)$ steps to achieve $\epsilon$-accuracy in TV distance. The results are broadly applicable to settings with large $n$ and large $d$, including settings with highly imbalanced response data in Probit and Logit regression. We compare these guarantees with the best known guarantees for Langevin Monte Carlo and the Metropolis Adjusted Langevin Algorithm. We evaluate our theoretical results using numerical examples and discuss the mixing times of the three algorithms under feasible initialization.
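To make the two-block structure of a DA algorithm concrete, here is a minimal sketch of the Albert–Chib sampler for Bayesian Probit regression (the ProbitDA algorithm the abstract refers to). It alternates between sampling truncated-normal latent variables given the coefficients and sampling the coefficients given the latents. The Gaussian prior $\mathcal{N}(0, \tau I)$, the parameter `tau`, and the function name `probit_da` are illustrative choices, not taken from the paper.

```python
import numpy as np
from scipy.stats import truncnorm

def probit_da(X, y, n_iter=500, tau=10.0, rng=None):
    """Albert-Chib data augmentation sampler for Bayesian Probit
    regression with an illustrative N(0, tau*I) prior on beta.

    X: (n, d) design matrix; y: (n,) binary responses in {0, 1}.
    Returns an (n_iter, d) array of beta draws.
    """
    rng = np.random.default_rng(rng)
    n, d = X.shape
    # Posterior covariance of beta given the latents: (X'X + I/tau)^{-1}.
    B = np.linalg.inv(X.T @ X + np.eye(d) / tau)
    L = np.linalg.cholesky(B)
    beta = np.zeros(d)
    draws = np.empty((n_iter, d))
    for t in range(n_iter):
        # Block 1: latent z_i ~ N(x_i' beta, 1), truncated to
        # (0, inf) if y_i = 1 and (-inf, 0) if y_i = 0.
        mu = X @ beta
        lo = np.where(y == 1, -mu, -np.inf)  # bounds standardized by mu
        hi = np.where(y == 1, np.inf, -mu)
        z = mu + truncnorm.rvs(lo, hi, random_state=rng)
        # Block 2: beta | z ~ N(B X'z, B), drawn via the Cholesky factor.
        beta = B @ (X.T @ z) + L @ rng.standard_normal(d)
        draws[t] = beta
    return draws
```

The paper's $\mathcal{O}\left(n\log\left(\frac{\log \eta}{\epsilon}\right)\right)$ bound concerns how many of these two-block iterations such a chain needs from a warm start; the sketch above only illustrates the transition kernel being analyzed.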