Modelling non-stationary functions with Gaussian processes

2019 
Gaussian processes (GPs) are a central piece of non-parametric Bayesian methods, allowing priors to be placed over functions in settings such as classification and regression. The prior is described by a kernel function that encodes the similarity between any two points in the input space, and thus defines the properties of the functions modelled by the GP. In applying Gaussian processes the choice of kernel is crucial, and the commonly used standard kernels often perform unsatisfactorily because they assume stationarity. This thesis presents approaches to modelling non-stationarity in Gaussian processes from two different perspectives.

First, this thesis presents a formulation of a non-stationary spectral mixture kernel for univariate outputs, focusing on modelling non-stationarity in the input space. The construction is based on the spectral mixture (SM) kernel, which was derived for stationary functions using the Fourier duality implied by Bochner's theorem. The work in this thesis extends the SM kernel to the non-stationary case. This is achieved by two complementary approaches, both based on replacing the constant frequency parameters with input-dependent functions. In the first approach, the latent functions describing the frequency surface are modelled as Gaussian processes. In the second approach, the functions are modelled directly by a neural network whose parameters are optimized with respect to the variational evidence lower bound (ELBO).

Second, this thesis presents a kernel suitable for modelling non-stationary couplings between multiple output variables of interest in the context of multi-task or multi-output GP regression. The kernel is constructed as a Hadamard product of two kernels, which model different aspects of the dependencies between the outputs. The part of the kernel modelling the input-dependent couplings is based on a generalized Wishart process, a stochastic process over time-varying positive-definite matrices, in this case describing the changing dependencies between the outputs. The proposed Hadamard product kernel is applied in a latent factor model to enrich the latent variable prior distribution, that is, to model correlations within the latent variables explicitly. This results in the latent correlation Gaussian process (LCGP) model.

This thesis additionally considers novel, flexible models for classification of multi-view data, specifically one based on a mixture of group factor analyzers (GFA). The model is closely related to the LCGP, which builds a classifier in the latent variable space, whereas the classifier in the GFA mixture is based on the mixture assignments. GFA also allows modelling dependencies between groups of variables, which the LCGP does not. Applying Gaussian processes and adapting the proposed multi-output kernel would make the multi-view model even more general. The methods introduced in this thesis now allow modelling non-stationary functions in Gaussian processes in a flexible…
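
To make the stationary starting point concrete, the following equations summarize Bochner's theorem and the resulting SM kernel for a one-dimensional input, as they appear in the standard spectral mixture literature (they are a reminder, not reproduced from the thesis itself); the symbols w_q, \mu_q and \sigma_q for the mixture weights, spectral means and spectral widths are our notation:

    k(\tau) = \int_{\mathbb{R}} e^{2\pi i s \tau}\, S(s)\, \mathrm{d}s, \qquad \tau = x - x',

    k_{\mathrm{SM}}(\tau) = \sum_{q=1}^{Q} w_q \exp\!\left(-2\pi^2 \sigma_q^2 \tau^2\right) \cos\!\left(2\pi \mu_q \tau\right).

The non-stationary extensions described in the abstract replace (at least) the constant frequency parameters \mu_q with input-dependent functions \mu_q(x), modelled either as latent Gaussian processes or through a neural network trained against the ELBO.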
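As a rough illustration of what an input-dependent-frequency kernel can look like, the NumPy sketch below combines a Gibbs-style input-dependent lengthscale term with a cosine of input-dependent frequencies. The function names (gibbs_term, nonstationary_sm_kernel) and the exact parameterization are assumptions made here for illustration only; the thesis's own construction and its inference scheme are not reproduced.

    import numpy as np

    def gibbs_term(x1, x2, ell):
        """Gibbs-style term with an input-dependent lengthscale ell(x) > 0."""
        l1, l2 = ell(x1), ell(x2)
        denom = l1**2 + l2**2
        return np.sqrt(2.0 * l1 * l2 / denom) * np.exp(-((x1 - x2) ** 2) / denom)

    def nonstationary_sm_kernel(X, Xp, w_fns, mu_fns, ell_fns):
        """Toy non-stationary spectral-mixture-style kernel (illustrative only).

        Each component q has input-dependent weight w_q(x), frequency mu_q(x)
        and lengthscale ell_q(x); making them constant recovers a stationary
        SM-like kernel up to parameterization.
        """
        X = np.asarray(X, dtype=float).ravel()
        Xp = np.asarray(Xp, dtype=float).ravel()
        K = np.zeros((X.size, Xp.size))
        for w, mu, ell in zip(w_fns, mu_fns, ell_fns):
            for i, xi in enumerate(X):
                for j, xj in enumerate(Xp):
                    K[i, j] += (
                        w(xi) * w(xj)
                        * gibbs_term(xi, xj, ell)
                        * np.cos(2 * np.pi * (mu(xi) * xi - mu(xj) * xj))
                    )
        return K

    # Hypothetical latent functions; in the thesis these are GPs or a neural net.
    w_fns = [lambda x: 1.0]
    mu_fns = [lambda x: 0.5 + 0.1 * x]      # frequency increases with x
    ell_fns = [lambda x: 1.0 + 0.05 * x**2]
    X = np.linspace(0.0, 5.0, 50)
    K = nonstationary_sm_kernel(X, X, w_fns, mu_fns, ell_fns)
    print(K.shape, np.allclose(K, K.T))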
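For the multi-output part, the sketch below illustrates, under our own assumptions, a Hadamard product of a squared-exponential input kernel with a low-rank, input-dependent coupling term of the kind a generalized Wishart process produces. The latent functions u_r(x) are replaced by random stand-ins rather than actual GP draws, and all names and shapes are hypothetical rather than the thesis's implementation.

    import numpy as np

    def rbf(x1, x2, lengthscale=1.0):
        """Squared-exponential kernel on the inputs."""
        return np.exp(-0.5 * (x1 - x2) ** 2 / lengthscale**2)

    def hadamard_multioutput_kernel(X, U, L, lengthscale=1.0):
        """Toy Hadamard-product multi-output kernel (illustrative only).

        X : (N,) input locations.
        U : (nu, N, D) latent functions u_r(x); in the thesis these come from GPs,
            giving a generalized-Wishart-process-style coupling.
        L : (D, D) mixing matrix.

        Returns the (N*D, N*D) covariance over (input, output) pairs:
            k((x, d), (x', d')) = k_input(x, x') * sum_r [L u_r(x)]_d [L u_r(x')]_{d'}
        """
        nu, N, D = U.shape
        phi = np.einsum("de,rne->rnd", L, U)              # phi[r, n, :] = L @ u_r(x_n)
        coupling = np.einsum("rnd,rme->ndme", phi, phi)   # inner product over r
        K_in = rbf(X[:, None], X[None, :], lengthscale)   # (N, N) input kernel
        K = coupling * K_in[:, None, :, None]             # Hadamard product
        return K.reshape(N * D, N * D)

    # Hypothetical toy setup: 20 inputs, 3 outputs, nu = 2 latent draws.
    rng = np.random.default_rng(0)
    X = np.linspace(0.0, 1.0, 20)
    U = rng.standard_normal((2, 20, 3))   # stand-in for GP draws u_r(x)
    L = np.eye(3)
    K = hadamard_multioutput_kernel(X, U, L)
    print(K.shape, np.allclose(K, K.T))

Because both factors are positive semidefinite kernels over the (input, output) index pairs, their elementwise (Hadamard) product is again positive semidefinite by the Schur product theorem, which is what makes this style of construction a valid multi-output covariance.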