Neural networks with superexpressive activations and integer weights.

2021 
An example of an activation function $\sigma$ is given such that networks with activations $\{\sigma, \lfloor\cdot\rfloor\}$, integer weights, and a fixed architecture depending only on $d$ can approximate continuous functions on $[0,1]^d$. The range of integer weights required for $\varepsilon$-approximation of H\"older continuous functions is derived, which leads to a convergence rate of order $n^{-\frac{2\beta}{2\beta+d}}\log_2 n$ for neural network regression estimation of an unknown $\beta$-H\"older continuous function from $n$ samples.
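The stated rate can be evaluated numerically to see how smoothness and dimension trade off. The following is a minimal illustrative sketch (not from the paper) that computes $n^{-\frac{2\beta}{2\beta+d}}\log_2 n$ for sample values of the smoothness $\beta$, the input dimension $d$, and the sample size $n$; the function name `holder_rate` is our own.

```python
import math

def holder_rate(n: int, beta: float, d: int) -> float:
    """Convergence rate n^{-2*beta/(2*beta+d)} * log2(n) from the abstract.

    Illustrative computation only: beta is the Hölder smoothness of the
    target function, d the input dimension, n the number of samples.
    """
    return n ** (-2.0 * beta / (2.0 * beta + d)) * math.log2(n)

# Larger beta (smoother target) or smaller d gives a faster-decaying rate;
# the growth of the exponent's denominator with d reflects the curse of
# dimensionality.
print(holder_rate(10**4, beta=1.0, d=2))  # Lipschitz target on [0,1]^2
print(holder_rate(10**4, beta=1.0, d=8))  # same smoothness, higher dimension
```

For fixed $\beta$, the exponent $\tfrac{2\beta}{2\beta+d}$ shrinks toward $0$ as $d$ grows, so the rate degrades in high dimension, as the formula suggests.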