Fast Training of Recurrent Neural Networks by the Recursive Least Squares Method

1997 
In this work a novel approach to the training of recurrent neural networks is presented. The algorithm exploits the separability of each neuron into its linear and nonlinear parts. Each iteration of the learning procedure consists of two steps: first, the descent of the error functional is performed in the space of the linear outputs of the neurons (descent in the neuron space); then the weights are updated by solving a linear system with a Recursive Least Squares technique.
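The paper itself gives no code, but the second step it describes, updating a neuron's weights by Recursive Least Squares against a desired linear output, can be sketched as follows. This is a minimal, hypothetical illustration of a standard RLS update for a single linear neuron (the function name, forgetting factor, and initialization are assumptions, not the authors' implementation):

```python
import numpy as np

def rls_update(w, P, x, d, lam=0.99):
    """One standard Recursive Least Squares step (illustrative, not the
    paper's code): adapt weights w so that w @ x tracks the target d.
    P estimates the inverse input-correlation matrix; lam is the
    forgetting factor."""
    Px = P @ x
    k = Px / (lam + x @ Px)            # gain vector
    e = d - w @ x                      # a priori error
    w = w + k * e                      # weight update
    P = (P - np.outer(k, Px)) / lam    # inverse-correlation update
    return w, P

# Usage: recover the weights of a linear neuron from noiseless samples.
rng = np.random.default_rng(0)
w_true = np.array([1.0, -2.0, 0.5])
w = np.zeros(3)
P = np.eye(3) * 100.0                  # large initial P = weak prior
for _ in range(200):
    x = rng.standard_normal(3)
    d = w_true @ x                     # desired linear output of the neuron
    w, P = rls_update(w, P, x, d)
print(np.allclose(w, w_true, atol=1e-3))  # → True
```

In the paper's setting, the target `d` for each neuron would come from the preceding descent step in the neuron space rather than from a known mapping as in this toy example.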