Expectation propagation

Expectation propagation (EP) is a technique in Bayesian machine learning for finding approximations to a probability distribution. It uses an iterative approach that exploits the factorization structure of the target distribution, and it differs from other Bayesian approximation approaches such as variational Bayesian methods. More specifically, suppose we wish to approximate an intractable probability distribution $p(\mathbf{x})$ with a tractable distribution $q(\mathbf{x})$. Expectation propagation achieves this by minimizing the Kullback–Leibler divergence $\mathrm{KL}(p \,\|\, q)$, whereas variational Bayesian methods minimize $\mathrm{KL}(q \,\|\, p)$ instead. If $q(\mathbf{x})$ is a Gaussian $\mathcal{N}(\mathbf{x} \mid \mu, \Sigma)$, then $\mathrm{KL}(p \,\|\, q)$ is minimized by setting $\mu$ and $\Sigma$ equal to the mean and the covariance of $p(\mathbf{x})$, respectively; this is known as moment matching. Moment matching plays a vital role in EP's approximation of the indicator functions that appear when deriving the message-passing equations for TrueSkill.
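As a concrete illustration of moment matching, the minimal sketch below approximates a bimodal Gaussian mixture (standing in for an intractable $p(\mathbf{x})$) by the single Gaussian $q$ that minimizes $\mathrm{KL}(p \,\|\, q)$, i.e. the Gaussian whose mean and variance match those of $p$. The mixture parameters, the integration grid, and all variable names are illustrative assumptions chosen for this example, not part of any EP library.

```python
# Moment matching: fit the Gaussian q minimizing KL(p || q) to a
# bimodal mixture p. For a Gaussian family, the minimizer is the
# Gaussian with the same mean and variance as p.
# (All parameters below are illustrative assumptions.)
import numpy as np

weights = np.array([0.3, 0.7])   # mixture weights
means = np.array([-2.0, 1.5])    # component means
stds = np.array([0.5, 1.0])      # component standard deviations

def p(x):
    # Mixture density evaluated pointwise on a 1-D array x.
    comps = (weights / (stds * np.sqrt(2 * np.pi))
             * np.exp(-0.5 * ((x[:, None] - means) / stds) ** 2))
    return comps.sum(axis=1)

# Compute the moments of p numerically on a dense grid.
x = np.linspace(-10.0, 10.0, 20001)
px = p(x)
px /= np.trapz(px, x)                   # normalize numerically
mu = np.trapz(x * px, x)                # E_p[x]
var = np.trapz((x - mu) ** 2 * px, x)   # Var_p[x]
print(f"moment-matched q: N(mu={mu:.3f}, sigma^2={var:.3f})")

# Exact mixture moments, for comparison with the numerical result.
mu_exact = weights @ means
var_exact = weights @ (stds ** 2 + means ** 2) - mu_exact ** 2
print(f"exact moments:    mu={mu_exact:.3f}, sigma^2={var_exact:.3f}")
```

Note the characteristic mass-covering behaviour: because $q$ minimizes $\mathrm{KL}(p \,\|\, q)$, it spreads its probability mass over both modes of the mixture, whereas minimizing $\mathrm{KL}(q \,\|\, p)$, as variational Bayesian methods do, would typically lock onto a single mode.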

[ "Gaussian process" ]
Parent Topic
Child Topic
    No Parent Topic