Bayesian Nonlinear Function Estimation with Approximate Message Passing

2019 
In many areas, massive amounts of data are collected and analyzed in order to explain various phenomena. Variables or features that may explain the phenomena of interest are observed, and the goal is to learn a (possibly) nonlinear function that relates these explanatory variables to the phenomena of interest. To perform nonlinear function estimation, we convert the nonlinear inverse problem to a linear one using a polynomial kernel expansion. These kernels enlarge the feature set and often result in poorly conditioned matrices. Nonetheless, we show that the matrix in our linear inverse problem contains only mild linear correlations among columns, allowing us to estimate the coefficient vector using approximate message passing (AMP), an algorithmic framework for signal reconstruction. While modeling the coefficients within a Bayesian setting limits the scope of our approach, AMP offers Bayes-optimal signal reconstruction quality. Numerical results confirm that our AMP-based approach learns the function better than existing approaches such as LASSO, offering markedly lower error in predicting test data.
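
To make the recipe concrete, the following is a minimal sketch of the pipeline the abstract describes: a degree-2 polynomial feature expansion that converts the nonlinear problem into a linear one, followed by AMP with a separable Bayesian denoiser. It assumes an i.i.d. Gaussian prior on the coefficients (whose MMSE denoiser is a linear shrinkage), scikit-learn's PolynomialFeatures for the expansion, and a synthetic data-generating function; the paper's prior, kernel degree, and AMP details may differ.

import numpy as np
from sklearn.preprocessing import PolynomialFeatures

rng = np.random.default_rng(0)

# Hypothetical toy data: the response depends nonlinearly on a few of the
# explanatory variables (this generating function is illustrative only).
def f(X):
    return 2.0 * X[:, 0] - 1.5 * X[:, 1] * X[:, 2] + 0.5 * X[:, 3] ** 2

X_tr = rng.standard_normal((200, 8))
y_tr = f(X_tr) + 0.1 * rng.standard_normal(200)
X_te = rng.standard_normal((100, 8))
y_te = f(X_te)

# Polynomial kernel expansion: turn the nonlinear problem into a linear one,
# y ~ A @ beta, where A holds all degree<=2 monomials of the original features.
poly = PolynomialFeatures(degree=2, include_bias=False)
A = poly.fit_transform(X_tr)
col_norm = np.linalg.norm(A, axis=0)
A = A / col_norm                          # column-normalize, as AMP expects
n, N = A.shape

# AMP with an assumed i.i.d. Gaussian prior beta_i ~ N(0, sigma_x2); its MMSE
# denoiser is the linear shrinkage eta(r) = sigma_x2 / (sigma_x2 + tau2) * r.
sigma_x2 = np.mean(y_tr ** 2) * n / N     # crude moment-based prior variance
beta = np.zeros(N)
z = y_tr.copy()
for _ in range(50):
    tau2 = np.mean(z ** 2)                # effective noise variance estimate
    r = beta + A.T @ z                    # pseudo-data seen by the denoiser
    shrink = sigma_x2 / (sigma_x2 + tau2)
    beta_new = shrink * r                 # MMSE denoising under Gaussian prior
    # Onsager correction term keeps the effective noise approximately Gaussian.
    z = y_tr - A @ beta_new + (N / n) * shrink * z
    beta = beta_new

# Predict on held-out data using the same expansion and column scaling.
A_te = poly.transform(X_te) / col_norm
print("test prediction MSE:", np.mean((y_te - A_te @ beta) ** 2))

In this sketch the denoiser is linear only because of the Gaussian-prior assumption; a sparsity-inducing prior would replace the shrink step with the corresponding conditional-mean denoiser while the rest of the iteration stays the same.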