Closed-Form Entropy Limits - A Tool to Monitor Likelihood Optimization of Probabilistic Generative Models

2012 
The maximization of the data likelihood under a given probabilistic generative model is the essential goal of many algorithms for unsupervised learning. If expectation maximization (EM) is used for optimization, a lower bound on the data likelihood, the free-energy, is optimized. The parameter-dependent part of the free-energy (the difference between free-energy and posterior entropy) is the essential quantity in the derivation of learning algorithms. Here we show that for many common generative models the optimal values of this parameter-dependent part can be derived in closed form. These closed-form expressions are given as sums of the negative (differential) entropies of the individual model distributions. We apply our theoretical results to derive such closed-form expressions for a number of common and recent models, including probabilistic PCA, factor analysis, different versions of sparse coding, and linear dynamical systems. The main contribution of this work is theoretical, but we show how the derived results can be used to efficiently compute free-energies and how they can serve as consistency checks for learning algorithms.
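As a concrete illustration of the consistency-check use mentioned in the abstract, the sketch below is a minimal, hypothetical example (not the authors' code): it runs standard EM for probabilistic PCA on toy data and compares a numerical evaluation of the parameter-dependent part of the free-energy with the sum of the negative entropies of the prior p(s) = N(0, I) and the noise model p(x|s) = N(Ws, sigma^2 I), which is what the closed-form expressions amount to for this model under the assumptions made here. All dimensions, variable names, and the toy data are illustrative choices, not taken from the paper.

```python
# Minimal sketch of the consistency check for probabilistic PCA (illustrative
# assumptions throughout; not the authors' code).
import numpy as np

rng = np.random.default_rng(0)
N, D, H = 2000, 10, 3

# Toy data from a ground-truth pPCA-like model.
W_true = rng.normal(size=(D, H))
X = rng.normal(size=(N, H)) @ W_true.T + 0.5 * rng.normal(size=(N, D))
X = X - X.mean(axis=0)            # zero-mean data for simplicity
S = (X.T @ X) / N                 # data covariance (D x D)

# Standard EM for probabilistic PCA (Tipping & Bishop style updates).
W = rng.normal(size=(D, H))
sigma2 = 1.0
for _ in range(500):
    M = W.T @ W + sigma2 * np.eye(H)             # H x H
    Minv = np.linalg.inv(M)
    E_s = X @ W @ Minv                           # posterior means, N x H
    E_ssT = sigma2 * Minv + (E_s.T @ E_s) / N    # average posterior second moment
    W_new = (X.T @ E_s / N) @ np.linalg.inv(E_ssT)
    sigma2 = np.trace(S - (X.T @ E_s / N) @ W_new.T) / D
    W = W_new

# Numerical value of the parameter-dependent free-energy part per data point:
# F_tilde = E_q[ log p(s) + log p(x|s, Theta) ] under the exact posterior q.
M = W.T @ W + sigma2 * np.eye(H)
Minv = np.linalg.inv(M)
E_s = X @ W @ Minv
cov_s = sigma2 * Minv
E_ss = np.trace(cov_s) + np.mean(np.sum(E_s**2, axis=1))
resid = np.mean(np.sum((X - E_s @ W.T)**2, axis=1)) + np.trace(W @ cov_s @ W.T)
F_tilde = (-0.5 * H * np.log(2 * np.pi) - 0.5 * E_ss
           - 0.5 * D * np.log(2 * np.pi * sigma2) - 0.5 * resid / sigma2)

# Closed-form entropy limit: negative entropies of prior and noise distribution.
entropy_limit = -(0.5 * H * np.log(2 * np.pi * np.e)
                  + 0.5 * D * np.log(2 * np.pi * np.e * sigma2))

print("numerical F_tilde per data point :", F_tilde)
print("closed-form entropy limit        :", entropy_limit)
```

At convergence the two printed values should agree closely; a noticeable discrepancy would point to a bug in the E- or M-step, which is the kind of consistency check of learning algorithms the abstract describes.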