partial-FORCE: A fast and robust online training method for recurrent neural networks

2021 
Recurrent neural networks (RNNs) are useful tools for modeling dynamical systems realized by neuronal populations, but training RNNs efficiently has remained a challenging problem. In recent years, a recursive least squares (RLS) based method that modifies all the recurrent connections, called the full-FORCE method, has been gaining attention as a fast and robust online training rule. This method introduces a second network (called the teacher reservoir) during training to provide suitable target dynamics to all the hidden units of the task-performing network (called the student network). Thanks to the RLS-based approach, the full-FORCE method can be applied to training continuous-time networks and spiking neural networks. In this study, we propose a generalized version of the full-FORCE method: the partial-FORCE method. In the proposed method, only part of the student network neurons (called supervised neurons) is supervised by only part of the teacher reservoir neurons (called supervising neurons). As a result of this relaxation, the size of the student network and that of the teacher reservoir can differ, which is biologically plausible as a possible model of memory transfer in the brain. Furthermore, we numerically show that the partial-FORCE method converges faster and is more robust against variations in parameter values and initial conditions than the full-FORCE method, without paying a higher computational cost.
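The training scheme described above can be sketched as follows. This is a minimal, hedged illustration (not the authors' implementation): a teacher reservoir driven by a target signal generates internal dynamics, and RLS updates the recurrent weight rows of a chosen subset of student neurons so that their recurrent input tracks the drive of a matching subset of teacher neurons. All sizes, gains, and the pairing of supervised/supervising neurons here are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical sizes: student and teacher networks may differ (a key
# relaxation of partial-FORCE compared with full-FORCE).
N_s, N_t = 100, 150          # student / teacher network sizes
n_sup = 40                   # number of supervised student neurons
dt, tau, g = 0.01, 0.1, 1.5  # integration step, time constant, gain

# Teacher reservoir: fixed random recurrent weights, driven by the target f(t).
J_t = g * rng.standard_normal((N_t, N_t)) / np.sqrt(N_t)
u_t = rng.uniform(-1, 1, N_t)          # input weights carrying f(t)

# Student network: only the rows belonging to supervised neurons are trained.
J_s = g * rng.standard_normal((N_s, N_s)) / np.sqrt(N_s)

# Partial supervision: pick subsets on both sides (illustrative pairing).
sup_student = rng.choice(N_s, n_sup, replace=False)   # supervised neurons
sup_teacher = rng.choice(N_t, n_sup, replace=False)   # supervising neurons

# One RLS inverse-correlation matrix per supervised student neuron.
alpha = 1.0
P = np.stack([np.eye(N_s) / alpha for _ in range(n_sup)])

x_t = 0.1 * rng.standard_normal(N_t)   # teacher state
x_s = 0.1 * rng.standard_normal(N_s)   # student state

def f(t):                              # target output signal (example)
    return np.sin(2 * np.pi * t)

for step in range(1000):
    t = step * dt
    r_t, r_s = np.tanh(x_t), np.tanh(x_s)

    # Continuous-time rate dynamics (Euler step); teacher is driven by f(t).
    x_t += dt / tau * (-x_t + J_t @ r_t + u_t * f(t))
    x_s += dt / tau * (-x_s + J_s @ r_s)

    # Targets: recurrent drive of the supervising teacher neurons.
    target = (J_t @ r_t + u_t * f(t))[sup_teacher]
    err = (J_s @ r_s)[sup_student] - target

    # RLS update of each supervised student neuron's recurrent weight row.
    for k, i in enumerate(sup_student):
        Pr = P[k] @ r_s
        c = 1.0 / (1.0 + r_s @ Pr)
        P[k] -= c * np.outer(Pr, Pr)
        J_s[i] -= c * err[k] * Pr
```

Because only `n_sup` rows of `J_s` carry RLS state, the per-step cost scales with the number of supervised neurons rather than the full student size, which is consistent with the abstract's claim that the relaxation comes without additional computational cost.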