Hierarchical Deep Gaussian Processes Latent Variable Model via Expectation Propagation

2021 
Gaussian Processes (GPs) and related unsupervised learning techniques such as Gaussian Process Latent Variable Models (GP-LVMs) have been very successful in accurately modeling high-dimensional data from limited amounts of training data. However, these techniques typically suffer from high computational complexity. This makes it difficult to solve the associated learning problems for complex hierarchical models and large data sets, since the related computations, unlike those of neural networks, are not node-local. Combining sparse approximation techniques for GPs with Power Expectation Propagation, we present a framework for the computationally efficient implementation of hierarchical deep Gaussian process (latent variable) models. We provide implementations of this approach on the GPU as well as on the CPU, and we benchmark their efficiency across different optimization algorithms. We present the first implementation of such deep hierarchical GP-LVMs and demonstrate the computational efficiency of our GPU implementation.
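To illustrate the sparse approximation idea the abstract refers to, the following is a minimal sketch (not the authors' implementation): predictions are routed through M << N inducing points, reducing the O(N^3) cost of exact GP inference to roughly O(N M^2). The RBF kernel, the DTC/VFE-style posterior mean, and all parameter values here are illustrative assumptions, not details from the paper.

```python
import numpy as np

def rbf_kernel(A, B, lengthscale=1.0, variance=1.0):
    """Squared-exponential kernel matrix between the rows of A and B."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return variance * np.exp(-0.5 * d2 / lengthscale ** 2)

def sparse_gp_predict_mean(X, y, Z, X_star, noise_var=0.1, jitter=1e-6):
    """Predictive mean of a sparse GP with inducing inputs Z (DTC/VFE form)."""
    Kuu = rbf_kernel(Z, Z) + jitter * np.eye(len(Z))   # M x M inducing covariance
    Kuf = rbf_kernel(Z, X)                              # M x N cross-covariance
    Kus = rbf_kernel(Z, X_star)                         # M x N* cross-covariance
    # Only an M x M linear system is solved instead of an N x N one.
    Sigma = Kuu + Kuf @ Kuf.T / noise_var
    alpha = np.linalg.solve(Sigma, Kuf @ y) / noise_var
    return Kus.T @ alpha

# Toy usage: N = 500 noisy samples summarized by M = 20 inducing points.
rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(500, 1))
y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(500)
Z = np.linspace(-3, 3, 20)[:, None]
X_star = np.linspace(-3, 3, 5)[:, None]
print(sparse_gp_predict_mean(X, y, Z, X_star, noise_var=0.01))
```

In the hierarchical deep models described in the abstract, each layer's GP would be approximated in a similar inducing-point form, with Power Expectation Propagation providing the approximate inference across layers; that part is beyond the scope of this sketch.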