The Bayes factor has become an important tool for model selection, and marginal likelihoods are likewise important because they can be used to rank competing models; in fact, the Bayes factor is the ratio of the marginal likelihoods of two models with proper prior densities. We discuss the marginal likelihood for a class of generalized linear models used in small area estimation for mortality data analysis. Computation in these models is intensive and requires the implementation of Markov chain Monte Carlo (MCMC) methods. A sophisticated method for computing the marginal likelihoods of generalized linear models using reduced Metropolis-Hastings (M-H) samplers has recently been introduced, and a much simpler method based on the Laplace approximation has also been proposed. Our method lies between these two in simplicity: it uses importance sampling via a simple output analysis of an MCMC sampler. We also show that the new method can be approximated without running the MCMC sampler. We illustrate our methods for two Poisson regression models that have been used for mapping mortality data, and we compare them with the methods based on reduced M-H samplers and the Laplace approximation. The method based on the Laplace approximation gives smaller marginal log-likelihoods than the other three methods, but all of the methods select the same model. Finally, we use a simulation study in which the marginal likelihood is used to investigate the improvement in goodness of fit of one hierarchical Poisson regression model over the other.
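As a generic illustration of the kind of estimator the abstract describes (not the authors' specific method), the sketch below computes an importance-sampling estimate of the marginal likelihood for a simple Poisson regression: posterior draws from a short random-walk Metropolis run are used only to fit a Gaussian importance density, from which the marginal likelihood p(y) is estimated. The data, priors, and tuning constants are all illustrative assumptions.

```python
import numpy as np
from scipy import stats
from scipy.special import gammaln

rng = np.random.default_rng(0)

# Synthetic Poisson-regression data (illustrative; not the paper's mortality data).
n = 50
x = rng.normal(size=n)
X = np.column_stack([np.ones(n), x])
y = rng.poisson(np.exp(X @ np.array([0.3, 0.5])))

def log_lik(beta):
    eta = X @ beta
    return np.sum(y * eta - np.exp(eta) - gammaln(y + 1))

def log_prior(beta):
    # Proper N(0, 10^2) priors on the coefficients (an assumption).
    return stats.norm.logpdf(beta, 0.0, 10.0).sum()

def log_post(beta):
    return log_lik(beta) + log_prior(beta)

# Short random-walk Metropolis run to obtain posterior draws.
beta, lp = np.zeros(2), log_post(np.zeros(2))
draws = []
for _ in range(5000):
    prop = beta + 0.1 * rng.normal(size=2)
    lp_prop = log_post(prop)
    if np.log(rng.uniform()) < lp_prop - lp:
        beta, lp = prop, lp_prop
    draws.append(beta.copy())
draws = np.asarray(draws[1000:])  # discard burn-in

# Fit a Gaussian importance density to the MCMC output
# (covariance inflated so the proposal covers the posterior's tails).
q = stats.multivariate_normal(draws.mean(0), 1.5 * np.cov(draws.T))

# Importance-sampling estimate of the marginal likelihood:
#   p(y) ~= (1/S) * sum_s p(y | theta_s) p(theta_s) / q(theta_s),  theta_s ~ q.
S = 4000
theta = q.rvs(S, random_state=rng)
logw = np.array([log_lik(t) + log_prior(t) for t in theta]) - q.logpdf(theta)
log_ml = np.logaddexp.reduce(logw) - np.log(S)  # log p(y), computed stably
print("estimated log marginal likelihood:", log_ml)
```

Estimates such as this for two competing models would be combined into a Bayes factor by exponentiating the difference of their log marginal likelihoods.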
All Science Journal Classification (ASJC) codes
- Statistics and Probability
- Modelling and Simulation
- Statistics, Probability and Uncertainty
- Applied Mathematics