Exponential ReLU DNN Expression of Holomorphic Maps in High Dimension

2021 
For a parameter dimension $d\in\mathbb{N}$, we consider the approximation of many-parametric maps $u: [-1,1]^d\rightarrow\mathbb{R}$ by deep ReLU neural networks. The input dimension $d$ may be large, and we assume quantitative control of the domain of holomorphy of $u$: i.e., $u$ admits a holomorphic extension to a Bernstein polyellipse $\mathcal{E}_{\rho_1}\times\cdots\times\mathcal{E}_{\rho_d}\subset\mathbb{C}^d$ of semiaxis sums $\rho_i>1$ containing $[-1,1]^d$. We establish the exponential expression rate bound $O(\exp(-bN^{1/(d+1)}))$ in the $W^{1,\infty}([-1,1]^d)$ norm, in terms of the total size $N$ and the input dimension $d$ of the ReLU NN. The constant $b>0$ depends on $(\rho_j)_{j=1}^d$, which characterize the coordinate-wise sizes of the Bernstein ellipses for $u$. We also prove exponential convergence in stronger norms for approximation by DNNs with more regular, so-called "rectified power unit" (RePU) activations. Finally, we extend the DNN expression rate bounds to two classes of non-holomorphic functions: in particular, to $d$-variate, Gevrey-regular functions and, by composition, to certain multivariate probability distribution functions with Lipschitz marginals.
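For orientation, $\mathcal{E}_\rho$ above is the standard Bernstein ellipse with foci $\pm 1$; the abstract does not restate it, so the following is a brief recollection assuming the usual convention:

$$\mathcal{E}_\rho = \Bigl\{ \tfrac{1}{2}\bigl(z + z^{-1}\bigr) : z\in\mathbb{C},\ 1\le |z|\le \rho \Bigr\},$$

whose semi-major and semi-minor axes, $(\rho+\rho^{-1})/2$ and $(\rho-\rho^{-1})/2$, sum to $\rho$ (the "semiaxis sum"). In the univariate case $d=1$, the stated bound specializes to $O(\exp(-b\sqrt{N}))$, the familiar exponential rate for functions analytic on an open complex neighborhood of $[-1,1]$.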