Efficient Sequential Monte-Carlo Samplers for Bayesian Inference

2016 
In many problems, complex non-Gaussian and/or nonlinear models are required to accurately describe a physical system of interest. In such cases, Monte-Carlo algorithms are remarkably flexible and powerful approaches to solving such inference problems. However, in the presence of a high-dimensional and/or multimodal posterior distribution, it is widely documented that standard Monte-Carlo techniques can lead to poor performance. In this paper, the study is focused on the Sequential Monte-Carlo (SMC) sampler framework, a more robust and efficient Monte-Carlo algorithm. Although this approach presents many advantages over traditional Monte-Carlo methods, the potential of this emergent technique remains largely underexploited in signal processing. In this paper, we propose novel strategies to improve the efficiency and facilitate the practical implementation of the SMC sampler. First, we propose an automatic and adaptive strategy that selects the sequence of distributions within the SMC sampler so as to minimize the asymptotic variance of the estimator of the posterior normalization constant. The second original contribution improves the global efficiency of the SMC sampler by introducing a novel correction mechanism that allows the use of the particles generated through all iterations of the algorithm, instead of only those from the last iteration. This is a significant contribution, as it removes the need to discard a large portion of the generated samples, as is typical in standard SMC methods, and improves estimation performance in practical settings where the computational budget is limited.
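To make the two ideas concrete, the following is a minimal sketch of a tempered SMC sampler on a toy Gaussian model: it (i) adaptively chooses the next distribution in the tempering sequence and (ii) recycles particles from every iteration rather than keeping only the final population. Note that the adaptive rule used here (bisection on the effective sample size of the incremental weights) and the recycling rule (ESS-weighted combination of per-iteration populations) are generic, commonly used choices standing in for the variance-minimizing criteria and correction mechanism proposed in the paper; all function names and the toy model are illustrative assumptions.

```python
import numpy as np

def log_prior(x):                      # standard normal prior
    return -0.5 * np.sum(x**2, axis=-1)

def log_likelihood(x):                 # toy likelihood: N(2, 0.5^2) on each coordinate
    return -0.5 * np.sum(((x - 2.0) / 0.5)**2, axis=-1)

def ess(log_w):
    """Effective sample size of a set of (unnormalized) log-weights."""
    w = np.exp(log_w - log_w.max())
    w /= w.sum()
    return 1.0 / np.sum(w**2)

def next_temperature(x, beta, target_ess):
    """Bisection on the temperature increment so that the ESS of the
    incremental weights matches target_ess (a common adaptive rule,
    not the paper's variance-based criterion)."""
    lo, hi = 0.0, 1.0 - beta
    ll = log_likelihood(x)
    for _ in range(50):
        mid = 0.5 * (lo + hi)
        if ess(mid * ll) < target_ess:
            hi = mid
        else:
            lo = mid
    return beta + hi

def smc_sampler(n=2000, dim=2, target_frac=0.9, mh_steps=5, step=0.5, rng=None):
    rng = np.random.default_rng(rng)
    x = rng.standard_normal((n, dim))           # particles drawn from the prior (beta = 0)
    beta, log_Z = 0.0, 0.0
    history = []                                # (particles, ESS) stored at every iteration
    while beta < 1.0:
        new_beta = next_temperature(x, beta, target_frac * n)
        log_w = (new_beta - beta) * log_likelihood(x)
        log_Z += np.log(np.mean(np.exp(log_w - log_w.max()))) + log_w.max()
        w = np.exp(log_w - log_w.max()); w /= w.sum()
        x = x[rng.choice(n, size=n, p=w)]       # multinomial resampling
        beta = new_beta
        for _ in range(mh_steps):               # random-walk MH moves at temperature beta
            prop = x + step * rng.standard_normal(x.shape)
            log_acc = (log_prior(prop) + beta * log_likelihood(prop)
                       - log_prior(x) - beta * log_likelihood(x))
            accept = np.log(rng.uniform(size=n)) < log_acc
            x[accept] = prop[accept]
        history.append((x.copy(), ess(log_w)))
    return x, log_Z, history

def recycled_mean(history):
    """Combine all stored populations, weighting each by its ESS.
    A simple stand-in for the paper's correction mechanism, which
    derives the combination weights more carefully."""
    weights = np.array([e for _, e in history])
    weights /= weights.sum()
    return sum(w * xs.mean(axis=0) for (xs, _), w in zip(history, weights))

x_final, log_Z, history = smc_sampler(rng=0)
print("log normalization constant estimate:", log_Z)
print("posterior mean (final population):  ", x_final.mean(axis=0))
print("posterior mean (recycled):          ", recycled_mean(history))
```

The recycled estimator illustrates the point made in the abstract: every intermediate population carries information about the posterior, and discarding all but the last one wastes a large share of the computational budget.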