Projected Stein Variational Gradient Descent

Peng Chen, Omar Ghattas

The curse of dimensionality is a longstanding challenge in Bayesian inference in high dimensions. In this work, we propose a projected Stein variational gradient descent (pSVGD) method to overcome this challenge by exploiting the fundamental property of intrinsic low dimensionality of the data-informed subspace stemming from the ill-posedness of such problems. We adaptively construct the subspace using a gradient information matrix of the log-likelihood and apply pSVGD to the much lower-dimensional coefficients of the parameter projection. The method is demonstrated to be more accurate and efficient than SVGD. It is also shown to be more scalable with respect to the number of parameters, samples, data points, and processor cores via experiments with parameter dimensions ranging from the hundreds to the tens of thousands.
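As a rough illustration only (not the authors' implementation), the sketch below combines the two ingredients the abstract describes: a data-informed subspace built from the empirical gradient information matrix of the log-likelihood, and a standard SVGD update applied to the projected coefficients. The toy Gaussian problem, the fixed (non-adaptive) subspace rank, and all function names are hypothetical choices made for this example.

```python
import numpy as np

def svgd_step(X, grad_log_p, step=0.1):
    """One SVGD update with an RBF kernel and the median bandwidth heuristic."""
    n = X.shape[0]
    diff = X[:, None, :] - X[None, :, :]
    sq = np.sum(diff ** 2, axis=-1)
    h = np.median(sq) / np.log(n + 1.0) + 1e-8
    K = np.exp(-sq / h)
    grads = grad_log_p(X)
    # Kernel-smoothed gradient (attraction) plus the kernel-gradient repulsion term.
    phi = (K @ grads + (2.0 / h) * (K.sum(axis=1, keepdims=True) * X - K @ X)) / n
    return X + step * phi

def projection_basis(grad_loglik, X, rank):
    """Top eigenvectors of the empirical gradient information matrix
    H = (1/n) sum_i g_i g_i^T of the log-likelihood (hypothetical, fixed rank)."""
    G = grad_loglik(X)                    # (n, d) log-likelihood gradients
    H = G.T @ G / X.shape[0]              # (d, d) gradient information matrix
    eigvals, eigvecs = np.linalg.eigh(H)  # eigenvalues in ascending order
    return eigvecs[:, ::-1][:, :rank]     # top-`rank` eigenvectors, shape (d, rank)

# Toy problem: d = 50 parameters, but the Gaussian likelihood only informs the
# first 2 coordinates; the prior is standard normal in all coordinates.
rng = np.random.default_rng(0)
d, n, r = 50, 64, 2
data_mean = np.array([2.0, -1.0])

def grad_loglik(X):
    G = np.zeros_like(X)
    G[:, :2] = data_mean - X[:, :2]       # informed directions only
    return G

def grad_log_post(W, P):
    # Gradient of the projected posterior over coefficients w = P^T x:
    # standard-normal prior term plus the projected likelihood gradient.
    X = W @ P.T
    return -W + grad_loglik(X) @ P

X0 = rng.standard_normal((n, d))          # prior samples
P = projection_basis(grad_loglik, X0, r)  # (d, r) data-informed basis
W = X0 @ P                                # r-dimensional coefficients
for _ in range(500):
    W = svgd_step(W, lambda W_: grad_log_post(W_, P), step=0.2)

# Reconstruct: pSVGD moves samples only within the informed subspace, leaving
# the complementary (prior-dominated) directions untouched.
X = X0 + (W - X0 @ P) @ P.T
```

The point of the projection is visible in the last line: the SVGD kernel and update act on `r = 2` coefficients rather than all `d = 50` parameters, while the uninformed directions retain their prior samples.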