
Independent Doubly Adaptive Rejection Metropolis Sampling Within Gibbs Sampling

Martino, Luca;
2015-01-01

Abstract

Bayesian methods have become very popular in signal processing lately, even though performing exact Bayesian inference is often unfeasible due to the lack of analytical expressions for optimal Bayesian estimators. In order to overcome this problem, Monte Carlo (MC) techniques are frequently used. Several classes of MC schemes have been developed, including Markov Chain Monte Carlo (MCMC) methods, particle filters and population Monte Carlo approaches. In this paper, we concentrate on the Gibbs-type approach, where automatic and fast samplers are needed to draw from univariate (full-conditional) densities. The Adaptive Rejection Metropolis Sampling (ARMS) technique is widely used within Gibbs sampling, but suffers from an important drawback: an incomplete adaptation of the proposal in some cases. In this work, we propose an alternative adaptive MCMC algorithm (IA(2)RMS) that overcomes this limitation, speeding up the convergence of the chain to the target, allowing us to simplify the construction of the sequence of proposals, and thus reducing the computational cost of the entire algorithm. Note that, although IA(2)RMS has been developed as an extremely efficient MCMC-within-Gibbs sampler, it also provides excellent performance as a stand-alone algorithm when sampling from univariate distributions. In this case, the convergence of the proposal to the target is proved and a bound on the complexity of the proposal is provided. Numerical results, both for univariate (stand-alone IA(2)RMS) and multivariate (IA(2)RMS-within-Gibbs) distributions, show that IA(2)RMS outperforms ARMS and other classical techniques, providing a correlation among samples close to zero.
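To make the "MCMC-within-Gibbs" setting described above concrete, the following is a minimal, generic sketch of a Metropolis-within-Gibbs sampler: each coordinate is refreshed by one random-walk Metropolis-Hastings step targeting its full-conditional density. This illustrates only the surrounding Gibbs machinery, not the ARMS/IA(2)RMS adaptive proposal construction itself; the function name and the bivariate-Gaussian example are illustrative assumptions, not taken from the paper.

```python
import math
import random

def metropolis_within_gibbs(log_conditionals, x0, n_iters, proposal_sd=1.0, seed=0):
    # Generic Metropolis-within-Gibbs sketch (NOT the IA(2)RMS proposal
    # construction): log_conditionals[d](v, x) returns the log of the d-th
    # full-conditional density, up to a constant, evaluated at v given the
    # current state x of the remaining coordinates.
    rng = random.Random(seed)
    x = list(x0)
    chain = []
    for _ in range(n_iters):
        for d, log_cond in enumerate(log_conditionals):
            current = x[d]
            proposed = current + rng.gauss(0.0, proposal_sd)
            # Accept with probability min(1, p(proposed) / p(current)).
            if math.log(rng.random()) < log_cond(proposed, x) - log_cond(current, x):
                x[d] = proposed
        chain.append(list(x))
    return chain

# Example target (assumed for illustration): a bivariate Gaussian with
# correlation rho, whose full-conditionals are N(rho * x_other, 1 - rho^2).
rho = 0.8
log_conds = [
    lambda v, x: -((v - rho * x[1]) ** 2) / (2 * (1 - rho ** 2)),
    lambda v, x: -((v - rho * x[0]) ** 2) / (2 * (1 - rho ** 2)),
]
chain = metropolis_within_gibbs(log_conds, [0.0, 0.0], 3000)
```

ARMS and the proposed IA(2)RMS replace the fixed random-walk step above with an adaptive independent proposal built from past samples, which is what drives the proposal toward the full-conditional and pushes the correlation among samples toward zero.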
Adaptive MCMC
Adaptive rejection Metropolis sampling
Bayesian inference
Gibbs sampler
Metropolis-Hastings within Gibbs sampling
Monte Carlo methods
Files in this record:
There are no files associated with this record.

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/20.500.11769/614610
Citations
  • Scopus: 67
  • Web of Science (ISI): 55