Bayesian estimation of the inverted Beta-Liouville mixture models with extended variational inference

2021 
This paper addresses the problem of Bayesian estimation of the inverted Beta-Liouville mixture model (IBLMM), which offers considerable flexibility in modeling positive data. This problem does not usually admit an analytically tractable solution. Sampling approaches (e.g., Markov chain Monte Carlo (MCMC)) can be used, but they are usually computationally demanding and may therefore be impractical for real-world applications. We instead adopt the recently proposed extended variational inference (EVI) framework to address this problem in an elegant way. First, lower-bound approximations are introduced to the evidence lower bound (ELBO) (i.e., the original objective function) in the conventional variational inference (VI) framework, which yields a computationally tractable lower bound. Then, a closed-form analytical solution can be derived by taking this bound as the new objective function and optimizing it with respect to the individual variational factors. We verify the effectiveness of this method in two real applications, namely text categorization and face detection.
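The core EVI idea described above can be illustrated with a minimal numerical sketch. A common way to make an intractable expectation in the ELBO tractable is to replace a concave term such as log(x) with its first-order Taylor expansion around an auxiliary point x0, which upper-bounds log(x) and thus yields a lower bound on the ELBO when the term appears with a negative sign. This is a generic illustration of the bounding technique, not the paper's exact derivation for the IBLMM; the function name and the choice of expansion point are ours.

```python
import math

def log_taylor_bound(x, x0):
    """First-order Taylor expansion of log at x0.

    Because log is concave, this linearization satisfies
    log(x) <= log(x0) + (x - x0) / x0 for all x, x0 > 0,
    with equality at x = x0. Substituting it for log(x) in a
    negated ELBO term produces a tractable lower bound.
    """
    return math.log(x0) + (x - x0) / x0

# Verify the bound numerically over a range of points.
x0 = 1.5
for x in [0.1, 0.5, 1.0, 1.5, 2.0, 10.0]:
    assert log_taylor_bound(x, x0) >= math.log(x) - 1e-12

# The bound is tight at the expansion point.
assert abs(log_taylor_bound(x0, x0) - math.log(x0)) < 1e-12
```

In the EVI framework, bounds of this kind are applied to the intractable expectations (e.g., those involving normalizing constants of the IBLMM), after which coordinate ascent on the surrogate objective gives closed-form updates for each variational factor.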