Random projections for Bayesian regression

2017 
This article deals with random projections applied as a data reduction technique for Bayesian regression analysis. We show sufficient conditions under which the entire d-dimensional distribution is approximately preserved under random projections by reducing the number of data points from n to $k \in O(\mathrm{poly}(d/\varepsilon))$ in the case $n \gg d$. Under mild assumptions, we prove that evaluating a Gaussian likelihood function based on the projected data instead of the original data yields a $(1+O(\varepsilon))$-approximation in terms of the $\ell_2$ Wasserstein distance. Our main result shows that the posterior distribution of Bayesian linear regression is approximated up to a small error depending on only an $\varepsilon$-fraction of its defining parameters. This holds when using arbitrary Gaussian priors or the degenerate case of uniform distributions over $\mathbb{R}^d$ for $\beta$. Our empirical evaluations involve different simulated settings of Bayesian linear regression. Our experiments underline that the proposed method is able to recover the regression model up to small error while considerably reducing the total running time.
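
The sketch-and-solve idea described above can be illustrated with a minimal NumPy example. Here a Gaussian sketch matrix $S \in \mathbb{R}^{k \times n}$ compresses the data $(X, y)$ before the conjugate Gaussian posterior is computed; the problem sizes, noise variance, and prior scale below are illustrative assumptions, and the Gaussian sketch is only one of several embeddings satisfying the conditions discussed in the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative sizes with n >> d, as in the paper's setting.
n, d, k = 5000, 5, 500
beta_true = rng.normal(size=d)
X = rng.normal(size=(n, d))
y = X @ beta_true + rng.normal(scale=0.5, size=n)

# Gaussian sketch: entries N(0, 1/k) so that S^T S approximates the identity.
S = rng.normal(scale=1.0 / np.sqrt(k), size=(k, n))
SX, Sy = S @ X, S @ y

def posterior(X, y, sigma2=0.25, tau2=10.0):
    """Posterior mean and covariance for Bayesian linear regression
    with prior beta ~ N(0, tau2 * I) and known noise variance sigma2
    (hypothetical hyperparameters, chosen only for this sketch)."""
    precision = X.T @ X / sigma2 + np.eye(X.shape[1]) / tau2
    cov = np.linalg.inv(precision)
    mean = cov @ (X.T @ y / sigma2)
    return mean, cov

mean_full, _ = posterior(X, y)
mean_sketch, _ = posterior(SX, Sy)

rel_err = np.linalg.norm(mean_full - mean_sketch) / np.linalg.norm(mean_full)
print(rel_err)  # small relative error, shrinking as k grows
```

Because the sketched Gram matrix $(SX)^\top(SX)$ concentrates around $X^\top X$, the posterior computed from the $k$ projected rows stays close to the one computed from all $n$ rows, while the dominant cost drops from $O(nd^2)$ to $O(kd^2)$ after the projection.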