Self-Organizing Radial Basis Function Neural Network Using Accelerated Second-Order Learning Algorithm

2022 
Abstract: Gradient-based algorithms are commonly used to train radial basis function neural networks (RBFNNs). However, it remains difficult to avoid the vanishing gradient problem and thereby improve learning performance during training. For this reason, this paper develops an accelerated second-order learning (ASOL) algorithm to train an RBFNN. First, an adaptive expansion and pruning mechanism (AEPM) for the gradient space, based on the integrity and orthogonality of the hidden neurons, is designed: effective gradient information is continually added to the gradient space, while redundant gradient information is eliminated from it. Second, with AEPM, hidden neurons are generated or pruned accordingly, yielding a self-organizing RBFNN (SORBFNN) with reduced structural complexity and improved generalization ability; the structure and the parameters can then be optimized jointly during learning by the proposed ASOL-based SORBFNN (ASOL-SORBFNN). Third, theoretical analyses are given of the effectiveness of AEPM in avoiding the vanishing gradient and of the stability of the SORBFNN during structural adjustment, which together guarantee the successful application of the proposed ASOL-SORBFNN. Finally, several experimental studies illustrate the advantages of the proposed approach. Compared with other existing methods, the results show that ASOL-SORBFNN performs well in terms of both learning speed and prediction accuracy.
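The abstract does not spell out the algorithmic details, so the following is a rough illustrative sketch only: a plain Gaussian RBFNN whose output weights are fitted with a Levenberg-Marquardt-style second-order update, followed by a QR-based orthogonality check that flags hidden neurons carrying little new gradient information. All specifics here (fixed centers and widths, the damping factor mu, the pruning threshold) are assumptions for illustration, not the paper's actual ASOL-SORBFNN.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy 1-D regression task.
X = np.linspace(-3, 3, 200).reshape(-1, 1)
y = np.sin(X).ravel() + 0.05 * rng.standard_normal(200)

# Gaussian RBF hidden layer with fixed centers/widths (a simplification;
# the paper's method also adapts the structure during learning).
centers = np.linspace(-3, 3, 8).reshape(-1, 1)  # hypothetical initial structure
width = 1.0

def hidden(X):
    # Phi[i, j] = exp(-||x_i - c_j||^2 / (2 * width^2))
    d2 = (X - centers.T) ** 2
    return np.exp(-d2 / (2 * width ** 2))

w = np.zeros(centers.shape[0])
mu = 1e-2  # Levenberg-Marquardt damping factor (assumed value)

for epoch in range(50):
    Phi = hidden(X)      # hidden outputs, shape (N, H)
    e = Phi @ w - y      # residuals
    # Second-order (Gauss-Newton / LM) step on the output weights:
    # solve (J^T J + mu I) dw = -J^T e, where J = Phi for a linear output layer.
    H = Phi.T @ Phi + mu * np.eye(Phi.shape[1])
    w -= np.linalg.solve(H, Phi.T @ e)

# Orthogonality-style redundancy check: a hidden unit whose output column is
# nearly a linear combination of the others contributes little new gradient
# information, so it is a candidate for pruning.
Phi = hidden(X)
Q, R = np.linalg.qr(Phi)
keep = np.abs(np.diag(R)) > 1e-3 * np.abs(R[0, 0])
print("kept", keep.sum(), "of", len(keep), "hidden neurons")
```

In this toy setting the damped second-order step converges in a handful of iterations, which is the kind of speedup over first-order gradient descent that motivates second-order training of RBFNNs.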