Quality of randomness and node dropout regularization for fitting neural networks

2022 
The quality of randomness in generated random numbers is an attribute of a sufficiently random process and a sufficiently large sample size, and various statistical tests have been proposed to assess it. Random number generation is widely applied across the natural sciences, and one of its most prominent and widely adopted application areas is machine learning, where pseudorandom or stochastic number generation is used in a variety of tasks. Artificial neural networks, such as those used in deep learning, rely on random number generation for weight initialization, optimization, and methods that aim to reduce overfitting. One such method, node dropout, has been widely adopted; its internal logic is heavily dictated by the random number generator it uses. This study investigated the relationship between the quality of randomness and node dropout regularization in terms of reducing overfitting in neural networks. Our experiments included five different random number generators, whose output was tested for quality of randomness with various statistical tests. These sets of random numbers were then used to drive the internal logic of a node dropout layer in a neural network model across four different classification tasks. We measured the impact of data size and relevant hyperparameters, as well as the overall amount of overfitting, which was compared against the randomness results of each generator. The results suggest that true random number generation in node dropout can be either advantageous or disadvantageous, depending on the dataset and prediction problem at hand. These findings suggest that fitting neural networks in general can be improved by adding random number generation experimentation to the modelling process.
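
As a hypothetical illustration of the mechanism the abstract describes (this is not the authors' implementation), the sketch below shows an inverted node dropout layer whose mask is drawn from a pluggable random number source. The function name `dropout_forward` and the choice of NumPy's PCG64 and MT19937 bit generators as example generators are assumptions for illustration; the study's five generators are not named in this abstract.

```python
# Minimal sketch of node dropout with a pluggable random number source.
# Swapping `rng` swaps the generator that dictates the dropout mask,
# which is the kind of substitution the study experiments with.
import numpy as np

def dropout_forward(x, drop_rate, rng, training=True):
    """Inverted node dropout: zero each unit with probability `drop_rate`
    and rescale survivors so the expected activation is unchanged."""
    if not training or drop_rate == 0.0:
        return x
    keep_prob = 1.0 - drop_rate
    # The statistical quality of these uniform draws is what varies
    # between generators.
    mask = (rng.random(x.shape) < keep_prob).astype(x.dtype)
    return x * mask / keep_prob

# Two example generators (illustrative, not necessarily those studied):
pcg_rng = np.random.Generator(np.random.PCG64(seed=42))
mt_rng = np.random.Generator(np.random.MT19937(seed=42))

activations = pcg_rng.standard_normal((8, 16))
out_pcg = dropout_forward(activations, drop_rate=0.5, rng=pcg_rng)
out_mt = dropout_forward(activations, drop_rate=0.5, rng=mt_rng)
```

A true random number generator, as referenced in the abstract's conclusions, would replace `rng` with a source backed by a physical entropy process rather than a deterministic algorithm.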