The Gibbs sampler is a simple but powerful algorithm for simulating from a complex high-dimensional distribution. It is particularly useful in Bayesian analysis when a complex Bayesian model involves a number of model parameters and the conditional posterior distribution of each component given the others can be derived as a standard distribution. In the presence of a strong correlation structure among components, however, the Gibbs sampler can suffer from slow convergence. Here we discuss several algorithmic strategies, such as blocking, collapsing, and partial collapsing, that are available for improving the convergence characteristics of the Gibbs sampler.

This article is categorized under:
- Statistical and Graphical Methods of Data Analysis > Markov Chain Monte Carlo
- Statistical and Graphical Methods of Data Analysis > Bayesian Methods and Theory
- Statistical and Graphical Methods of Data Analysis > Sampling
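As a minimal sketch of the basic algorithm (not code from the article), consider a zero-mean bivariate normal target with correlation rho. The full conditionals are standard: x | y ~ N(rho*y, 1 - rho^2) and y | x ~ N(rho*x, 1 - rho^2). Alternately drawing from these conditionals gives a valid Gibbs sampler, and the lag-1 autocorrelation of the resulting chain grows with |rho|, illustrating the slow convergence under strong correlation noted in the abstract. All names below are illustrative.

```python
import numpy as np

def gibbs_bivariate_normal(rho, n_iter=5000, seed=0):
    """Gibbs sampler for a zero-mean bivariate normal with correlation rho.

    Full conditionals: x | y ~ N(rho*y, 1 - rho^2),
                       y | x ~ N(rho*x, 1 - rho^2).
    """
    rng = np.random.default_rng(seed)
    x, y = 0.0, 0.0                      # arbitrary starting point
    sd = np.sqrt(1.0 - rho**2)           # conditional standard deviation
    samples = np.empty((n_iter, 2))
    for t in range(n_iter):
        x = rng.normal(rho * y, sd)      # draw x from p(x | y)
        y = rng.normal(rho * x, sd)      # draw y from p(y | x)
        samples[t] = (x, y)
    return samples

def lag1_autocorr(z):
    """Sample lag-1 autocorrelation, a crude measure of mixing speed."""
    z = z - z.mean()
    return float(np.dot(z[:-1], z[1:]) / np.dot(z, z))

# Weak correlation mixes fast; strong correlation mixes slowly.
weak = gibbs_bivariate_normal(rho=0.1)
strong = gibbs_bivariate_normal(rho=0.99)
print(lag1_autocorr(weak[:, 0]), lag1_autocorr(strong[:, 0]))
```

In this example, blocking would correspond to drawing (x, y) jointly from the bivariate normal in one step, which removes the autocorrelation entirely; that is the intuition behind the blocking and collapsing strategies the article surveys.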
Journal: Wiley Interdisciplinary Reviews: Computational Statistics
Publication status: Accepted/In press - 2021
Bibliographical note
Funding information: This research was supported by the National Research Foundation of Korea (NRF) grant funded by the Korea government (MSIT), Grant/Award Number: 2020R1A2C1A01005949.
© 2021 Wiley Periodicals LLC.
All Science Journal Classification (ASJC) codes
- Statistics and Probability