Ever-increasing computational power, along with ever-more sophisticated statistical computing techniques, is making it possible to fit ever-more complex statistical models. Among the more computationally intensive methods, the Gibbs sampler is popular because of its simplicity and power to effectively generate samples from a high-dimensional probability distribution. Despite its simple implementation and description, however, the Gibbs sampler is criticized for its sometimes slow convergence, especially when it is used to fit highly structured complex models. Here we present partially collapsed Gibbs sampling strategies that improve the convergence by capitalizing on a set of functionally incompatible conditional distributions. Such incompatibility generally is avoided in the construction of a Gibbs sampler, because the resulting convergence properties are not well understood. We introduce three basic tools (marginalization, permutation, and trimming) that allow us to transform a Gibbs sampler into a partially collapsed Gibbs sampler with known stationary distribution and faster convergence.
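To make the setting concrete, here is a minimal sketch of an ordinary (uncollapsed) Gibbs sampler, the starting point that the paper's marginalization, permutation, and trimming tools transform. The target here, a bivariate normal with correlation `rho`, is an illustrative assumption, not an example from the paper; each component is drawn in turn from its full conditional distribution.

```python
import random

def gibbs_bivariate_normal(rho, n_iter, seed=0):
    """Plain Gibbs sampler for a standard bivariate normal with correlation rho.

    Both full conditionals are univariate normal:
        x | y ~ N(rho * y, 1 - rho**2)
        y | x ~ N(rho * x, 1 - rho**2)
    """
    rng = random.Random(seed)
    sd = (1 - rho ** 2) ** 0.5  # conditional standard deviation
    x, y = 0.0, 0.0             # arbitrary starting values
    samples = []
    for _ in range(n_iter):
        x = rng.gauss(rho * y, sd)  # draw x from p(x | y)
        y = rng.gauss(rho * x, sd)  # draw y from p(y | x)
        samples.append((x, y))
    return samples
```

As `rho` approaches 1 the conditionals become nearly degenerate and this sampler mixes very slowly, which illustrates the convergence problem that replacing some conditionals with (partially) marginalized draws is designed to alleviate.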
Bibliographical note / Funding Information:
David A. van Dyk is Professor, Department of Statistics, University of California Irvine, Irvine, CA 92697 (E-mail: firstname.lastname@example.org). Taeyoung Park is Assistant Professor, Department of Statistics, University of Pittsburgh, Pittsburgh, PA 15260 (E-mail: email@example.com). This project was supported in part by National Science Foundation grants DMS-01-04129, DMS-04-38240, and DMS-04-06085 and by National Aeronautics and Space Administration contracts NAS8-39073 and NAS8-03060 (CXC).
All Science Journal Classification (ASJC) codes
- Statistics and Probability
- Statistics, Probability and Uncertainty