A New Class of Polynomial Activation Functions of Deep Learning for Precipitation Forecasting
2022
Precipitation forecasting, an important chaotic system in earth system science, cannot be solved explicitly with theory-driven models. In recent years, deep learning models have achieved great success in various applications, including rainfall prediction. However, these models treat the task as image processing, regardless of the nature of the underlying physical system. We observe that the non-linear relationships learned by deep learning models, which rely mostly on the activation functions, are commonly weighted piecewise-continuous functions with bounded first-order derivatives. In contrast, polynomials are among the most widely used classes of functions in theory-driven models, applied to numerical approximation, dynamical system modeling, and more. Researchers began using polynomial activation functions (Pacs for short) in neural networks in the 1990s; yet amid the recent surge of research applying deep learning to scientific problems, it is striking that such a powerful class of basis functions is rarely used. In this paper, we investigate this gap and argue that, although polynomials excel at information extraction, they are too fragile to train stably. We solve their severe data-flow explosion problem with Chebyshev polynomials and prepended normalization, which enables networks with Pacs to go deep. To further enhance training robustness, we propose a normalization method called Range Norm. Performance on a synthetic dataset and a summer precipitation prediction task validates the necessity of this class of activation functions for simulating complex physical mechanisms. This new tool for deep learning opens a path toward automated theoretical physics analysis.
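The abstract's core idea, a Chebyshev polynomial activation preceded by a normalization that keeps inputs in the basis's bounded domain, can be sketched as follows. This is a minimal illustration, not the paper's implementation: the normalization choice (`tanh`), function name, and coefficient handling are assumptions; the Chebyshev recurrence T_{k+1}(x) = 2x·T_k(x) − T_{k−1}(x) itself is standard.

```python
import numpy as np

def chebyshev_activation(x, coeffs):
    """Hypothetical sketch of a polynomial activation (Pac) as a weighted
    sum of Chebyshev polynomials of the first kind."""
    # Prepended normalization (assumption: any map into [-1, 1] would do).
    # On [-1, 1] every Chebyshev basis function satisfies |T_k(x)| <= 1,
    # so the output is bounded by sum(|coeffs|) -- no data-flow explosion.
    x = np.tanh(x)
    t_prev = np.ones_like(x)   # T_0(x) = 1
    t_curr = x                 # T_1(x) = x
    out = coeffs[0] * t_prev
    if len(coeffs) > 1:
        out = out + coeffs[1] * t_curr
    for c in coeffs[2:]:
        # Three-term recurrence: T_{k+1}(x) = 2x * T_k(x) - T_{k-1}(x)
        t_prev, t_curr = t_curr, 2.0 * x * t_curr - t_prev
        out = out + c * t_curr
    return out
```

In a network, `coeffs` would be learnable per-layer parameters; without the prepended normalization, repeated application of a degree-n polynomial makes magnitudes grow doubly exponentially with depth, which is the instability the abstract describes.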