Comparison between Adam, AdaMax and AdamW optimizers to implement a Weather Forecast based on Neural Networks for the Andean city of Quito

2021 
The main function of an optimizer is to determine how much to change the weights and the learning rate of the neural network in order to reduce the loss. One of the best-known optimizers is Adam, whose main advantage is that the magnitudes of the parameter updates are invariant to rescaling of the gradient. However, other optimizers are often chosen because they generalize better. AdamW is a variant of Adam in which the weight decay is applied only after the parameter-wise step size has been computed. To provide a comparative scenario, this work implements a temperature forecast for the Andean city of Quito using a neural network structure with uncertainty reduction, and analyzes three optimizers: Adam, AdaMax, and AdamW. For the comparison, three error metrics were computed per hour to assess the effectiveness of the prediction. The analysis shows that Adam and AdaMax behave similarly, reaching a maximum hourly MSE of 2.5 °C, whereas AdamW reduces this error to around 1.3 °C.
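The abstract does not include code, but a minimal PyTorch sketch of such an optimizer comparison may clarify the setup. The network shape, learning rate, weight-decay value, and the synthetic stand-in for the Quito hourly temperature data are all illustrative assumptions, not the authors' implementation; the key point shown is that AdamW is selected via torch.optim.AdamW, whose weight decay is decoupled from the adaptive, parameter-wise gradient step.

```python
# Illustrative sketch (not the paper's code): compare Adam, AdaMax and AdamW
# on a small hourly-temperature regression task and report validation MSE.
import torch
import torch.nn as nn

def make_model():
    # Small feed-forward regressor: 24 lagged hourly readings -> next-hour temperature.
    return nn.Sequential(nn.Linear(24, 64), nn.ReLU(), nn.Linear(64, 1))

def train_and_score(optimizer_name, x_train, y_train, x_val, y_val, epochs=200):
    torch.manual_seed(0)  # same initialization for a fair comparison
    model = make_model()
    opt_ctors = {
        "Adam":   lambda p: torch.optim.Adam(p, lr=1e-3),
        "AdaMax": lambda p: torch.optim.Adamax(p, lr=1e-3),
        # AdamW applies weight decay directly to the weights, decoupled from
        # the adaptive step computed from the gradient moment estimates.
        "AdamW":  lambda p: torch.optim.AdamW(p, lr=1e-3, weight_decay=1e-2),
    }
    opt = opt_ctors[optimizer_name](model.parameters())
    loss_fn = nn.MSELoss()
    for _ in range(epochs):
        opt.zero_grad()
        loss = loss_fn(model(x_train), y_train)
        loss.backward()
        opt.step()
    with torch.no_grad():
        return loss_fn(model(x_val), y_val).item()  # validation MSE

if __name__ == "__main__":
    # Synthetic stand-in data in place of the Quito observations.
    x = torch.randn(512, 24)
    y = x.mean(dim=1, keepdim=True) + 0.1 * torch.randn(512, 1)
    x_tr, y_tr, x_va, y_va = x[:400], y[:400], x[400:], y[400:]
    for name in ("Adam", "AdaMax", "AdamW"):
        print(name, train_and_score(name, x_tr, y_tr, x_va, y_va))
```

In a sketch like this, fixing the seed and the model architecture across runs isolates the optimizer as the only varying factor, which mirrors the per-hour error comparison described in the abstract.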