In Bayesian inference, a simple and popular approach to reduce the burden of
computing high dimensional integrals against a posterior $\pi$ is to make the
Laplace approximation $\hat\gamma$. This is a Gaussian distribution, so
computing $\int fd\pi$ via the approximation $\int fd\hat\gamma$ is
significantly less expensive. In this paper, we make two general contributions
to the topic of high-dimensional Laplace approximations, as well as a third
contribution specific to a logistic regression model. First, we tighten the
dimension dependence of the error $|\int fd\pi - \int fd\hat\gamma|$ for a
broad class of functions $f$. Second, we derive a higher-accuracy approximation
$\hat\gamma_S$ to $\pi$, which is a skew-adjusted modification to $\hat\gamma$.
Our third contribution, in the setting of Bayesian inference for logistic
regression with Gaussian design, is to use the first two results to derive
upper bounds on the Laplace mean approximation error that hold uniformly over
different sample realizations, as well as matching lower bounds. In particular,
we prove a skewed Bernstein-von Mises theorem in this logistic regression
setting.
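As a concrete illustration of the Laplace approximation $\hat\gamma$ described above, the following sketch fits a toy Bayesian logistic regression with Gaussian design: it finds the posterior mode, inverts the Hessian of the negative log-posterior there, and uses the resulting Gaussian to approximate posterior integrals. All data, dimensions, and the standard-Gaussian prior are illustrative assumptions, not taken from the paper.

```python
import numpy as np
from scipy.optimize import minimize

# Toy setup: Gaussian design X, logistic likelihood, standard Gaussian prior.
# (Illustrative assumptions; the paper's actual model details may differ.)
rng = np.random.default_rng(0)
n, d = 200, 3
X = rng.normal(size=(n, d))
theta_true = np.array([1.0, -0.5, 0.25])
y = rng.binomial(1, 1.0 / (1.0 + np.exp(-X @ theta_true)))

def neg_log_post(theta):
    # Negative log-posterior: logistic log-likelihood plus Gaussian log-prior,
    # up to an additive constant.
    z = X @ theta
    return np.sum(np.log1p(np.exp(z)) - y * z) + 0.5 * theta @ theta

def hessian(theta):
    # Hessian of the negative log-posterior at theta.
    p = 1.0 / (1.0 + np.exp(-(X @ theta)))
    w = p * (1.0 - p)
    return X.T @ (w[:, None] * X) + np.eye(d)

# Laplace approximation: Gaussian centered at the MAP estimate,
# with covariance equal to the inverse Hessian at the mode.
res = minimize(neg_log_post, np.zeros(d))
mode, cov = res.x, np.linalg.inv(hessian(res.x))

# For a linear test function f(theta) = theta, the approximation
# int f d(gamma-hat) is simply the Gaussian mean, i.e. the mode.
print("Laplace mean approximation:", mode)
```

For nonlinear $f$, one would instead integrate against the fitted Gaussian, e.g. by drawing samples via `rng.multivariate_normal(mode, cov)` and averaging $f$ over them.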