A Monte Carlo Metropolis-Hastings algorithm for sampling from distributions with intractable normalizing constants

Faming Liang, Ick Hoon Jin

Research output: Contribution to journal › Letter › peer-review

7 Citations (Scopus)

Abstract

Simulating from distributions with intractable normalizing constants has been a long-standing problem in machine learning. In this letter, we propose a new algorithm, the Monte Carlo Metropolis-Hastings (MCMH) algorithm, for tackling this problem. The MCMH algorithm is a Monte Carlo version of the Metropolis-Hastings algorithm. It replaces the unknown normalizing constant ratio with a Monte Carlo estimate in simulations, while still converging, as shown in the letter, to the desired target distribution under mild conditions. The MCMH algorithm is illustrated with spatial autologistic models and exponential random graph models. Unlike other auxiliary variable Markov chain Monte Carlo (MCMC) algorithms, such as the Møller and exchange algorithms, the MCMH algorithm avoids the requirement for perfect sampling and thus can be applied to many statistical models for which perfect sampling is unavailable or very expensive. The MCMH algorithm can also be applied to Bayesian inference for random effect models and missing data problems that involve simulations from a distribution with intractable integrals.
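The core idea described above can be sketched in code. The following is a minimal illustrative sketch, not the authors' implementation: it uses a toy exponential model whose normalizing constant Z(θ) = 1/θ is actually known, but pretends it is intractable and replaces the ratio Z(θ′)/Z(θ) in the Metropolis-Hastings acceptance probability with an importance-sampling estimate from auxiliary draws. All function names and tuning parameters (`m`, `step`, `n_iter`) are illustrative choices, and the exact auxiliary draws would, in a real application, come from an ordinary MCMC run rather than exact sampling.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy model: x ~ Exp(theta), unnormalized density f(x; theta) = exp(-theta * x),
# normalizing constant Z(theta) = 1/theta.  We pretend Z is intractable and
# estimate the ratio Z(theta')/Z(theta) by Monte Carlo, in the spirit of MCMH.

def log_f(x, theta):
    # Log of the unnormalized density.
    return -theta * x

def estimate_log_ratio(theta, theta_prop, m=100):
    # Z(theta')/Z(theta) = E_{y ~ pi(.|theta)}[ f(y; theta') / f(y; theta) ].
    # In a real application y would come from an MCMC run targeting
    # pi(.|theta); for this toy model we can sample Exp(theta) exactly.
    y = rng.exponential(scale=1.0 / theta, size=m)
    w = np.exp(log_f(y, theta_prop) - log_f(y, theta))
    return np.log(np.mean(w))

def mcmh(data, n_iter=10000, step=0.3, m=100):
    n, s = len(data), data.sum()
    theta = 1.0
    samples = []
    for _ in range(n_iter):
        theta_prop = theta + step * rng.standard_normal()
        if theta_prop > 0:  # flat prior on theta > 0
            # Log acceptance ratio, with the intractable term
            # (Z(theta)/Z(theta'))^n replaced by its Monte Carlo estimate.
            log_zr = estimate_log_ratio(theta, theta_prop, m)
            log_alpha = -(theta_prop - theta) * s - n * log_zr
            if np.log(rng.random()) < log_alpha:
                theta = theta_prop
        samples.append(theta)
    return np.array(samples)

data = rng.exponential(scale=1.0 / 2.0, size=100)  # true rate 2.0
samples = mcmh(data)
post_mean = samples[2000:].mean()  # discard burn-in
print(post_mean)
```

Under a flat prior, the exact posterior here is Gamma(n + 1, Σx), so the chain's posterior mean should land near (n + 1)/Σx despite the acceptance ratio being only estimated, which is the convergence property the letter establishes under mild conditions.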

Original language: English
Pages (from-to): 2199-2234
Number of pages: 36
Journal: Neural Computation
Volume: 25
Issue number: 8
DOIs
Publication status: Published - 2013

All Science Journal Classification (ASJC) codes

  • Arts and Humanities (miscellaneous)
  • Cognitive Neuroscience
