Sparse approximation of triangular transports on bounded domains.
2020
Let $\rho$ and $\pi$ be two probability measures on $[-1,1]^d$ with positive and analytic Lebesgue densities. We investigate the approximation of the unique triangular monotone (Knothe-Rosenblatt) transport $T:[-1,1]^d\to [-1,1]^d$, such that the pushforward $T_\sharp\rho$ equals $\pi$. It is shown that for $d\in\mathbb{N}$ there exist approximations $\tilde T$ of $T$ based on either sparse polynomial expansions or ReLU networks, such that the distance between $\tilde T_\sharp\rho$ and $\pi$ decreases exponentially. More precisely, we show error bounds of the type $\exp(-\beta N^{1/d})$ (or $\exp(-\beta N^{1/(d+1)})$ for neural networks), where $N$ refers to the dimension of the ansatz space (or the size of the network) containing $\tilde T$; the notion of distance comprises, among others, the Hellinger distance and the Kullback--Leibler divergence. The construction guarantees $\tilde T$ to be a monotone triangular bijective transport on the hypercube $[-1,1]^d$. Analogous results hold for the inverse transport $S=T^{-1}$. The proofs are constructive, and we give an explicit a priori description of the ansatz space, which can be used for numerical implementations. Additionally we discuss the high-dimensional case: for $d=\infty$ a dimension-independent algebraic convergence rate is proved for a class of probability measures occurring widely in Bayesian inference for uncertainty quantification, thus verifying that the curse of dimensionality can be overcome in this setting.
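In one dimension, the Knothe-Rosenblatt transport described above reduces to the classical increasing rearrangement $T = F_\pi^{-1}\circ F_\rho$, where $F_\rho$ and $F_\pi$ are the cumulative distribution functions of the two densities. The following sketch (an illustration, not the paper's construction; the uniform reference and exponential target densities are chosen purely as examples) computes this map numerically on $[-1,1]$:

```python
import numpy as np

# Grid on the bounded domain [-1, 1].
x = np.linspace(-1.0, 1.0, 2001)

def rho(t):
    # Illustrative reference density: uniform on [-1, 1].
    return np.full_like(t, 0.5)

def pi(t):
    # Illustrative target density, positive and analytic (unnormalized;
    # cdf() below normalizes).
    return np.exp(t)

def cdf(density):
    # Trapezoidal cumulative integral, normalized so F(1) = 1 exactly.
    vals = density(x)
    F = np.concatenate(
        [[0.0], np.cumsum((vals[1:] + vals[:-1]) / 2 * np.diff(x))]
    )
    return F / F[-1]

F_rho, F_pi = cdf(rho), cdf(pi)

def T(t):
    # 1D Knothe-Rosenblatt map: T = F_pi^{-1} o F_rho,
    # with F_pi inverted by linear interpolation.
    return np.interp(np.interp(t, x, F_rho), F_pi, x)

# T is a monotone bijection of [-1, 1] onto itself.
assert abs(T(-1.0) + 1.0) < 1e-9 and abs(T(1.0) - 1.0) < 1e-9
```

For the chosen target, the median of $\pi$ is $\ln\cosh(1)$, so $T(0)\approx 0.4338$; in higher dimensions the KR construction applies this one-dimensional rearrangement coordinate by coordinate, conditioning each component on the previous ones, which yields the triangular structure.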