Supervised Learning in Spiking Neural Networks with Synaptic Delay-Weight Plasticity

2020 
Abstract: Spiking neurons encode information through their temporal spiking patterns. Although precise spike-timing-based encoding schemes have long been recognised, the exact mechanism that underlies the learning of such precise spike timing in the brain remains an open question. Most existing learning methods for spiking neurons are based on synaptic weight adjustment. However, biological evidence suggests that synaptic delays can also be modulated and can play an important role in learning. This paper investigates the viability of integrating synaptic delay plasticity into supervised learning and proposes a novel learning method, referred to as synaptic delay-weight plasticity, which adjusts both the synaptic delays and the weights of the learning neurons so that they fire precisely timed spikes. Two representative supervised learning methods, the Remote Supervised Method (ReSuMe) and the Perceptron-Based Spiking Neuron Learning Rule (PBSNLR), are studied to illustrate how synaptic delay-weight plasticity works. The performance of the proposed learning method is thoroughly evaluated on synthetic data and further demonstrated on real-world classification tasks. The experiments show that the synaptic delay-weight learning method outperforms traditional synaptic weight learning methods in many respects.
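The abstract does not reproduce the paper's update equations, but the general idea of combining a ReSuMe-style weight rule with delay adjustment can be illustrated with a minimal sketch. Everything below is an assumption for illustration only: the exponential learning window, the learning rates, the tanh-shaped delay shift, and the function names (`exp_kernel`, `delay_weight_update`) are hypothetical and not the authors' formulation.

```python
import numpy as np

# Hypothetical ReSuMe-like delay-weight update for a single readout neuron.
# All rules and constants here are illustrative assumptions, not the paper's
# exact method: weights move by a spike-correlation term, delays shift so
# that delayed input arrivals line up with the desired output spike times.

def exp_kernel(s, tau=5.0):
    """Causal exponential learning window over time lag s (ms)."""
    return np.where(s >= 0, np.exp(-s / tau), 0.0)

def delay_weight_update(in_spikes, actual_out, desired_out,
                        weights, delays,
                        lr_w=0.01, lr_d=0.05, a=0.01):
    """One pass of an assumed delay-weight plasticity rule.

    in_spikes   : list of arrays, presynaptic spike times per synapse (ms)
    actual_out  : array of the neuron's actual output spike times (ms)
    desired_out : array of the target output spike times (ms)
    weights     : array of synaptic weights
    delays      : array of synaptic delays (ms)
    """
    w, d = weights.copy(), delays.copy()
    for i, pre in enumerate(in_spikes):
        arrival = pre + d[i]                      # delayed arrival times
        for t_d in desired_out:                   # potentiate toward target
            s = t_d - arrival
            w[i] += lr_w * (a + exp_kernel(s)).sum()
            if len(arrival):                      # pull nearest arrival
                nearest = arrival[np.argmin(np.abs(t_d - arrival))]
                d[i] += lr_d * np.tanh((t_d - nearest) / 5.0)
        for t_a in actual_out:                    # depress erroneous output
            s = t_a - arrival
            w[i] -= lr_w * (a + exp_kernel(s)).sum()
    return w, np.clip(d, 0.0, None)               # keep delays non-negative

# Toy usage: three synapses, target spike at 20 ms, erroneous spike at 12 ms.
rng = np.random.default_rng(0)
in_spikes = [rng.uniform(0, 25, size=4) for _ in range(3)]
w, d = delay_weight_update(in_spikes,
                           actual_out=np.array([12.0]),
                           desired_out=np.array([20.0]),
                           weights=np.full(3, 0.5),
                           delays=np.ones(3))
print("updated weights:", w)
print("updated delays :", d)
```

The sketch only conveys the structural point made in the abstract: delays give the learner a second, timing-specific degree of freedom alongside the weights, so inputs can be realigned in time rather than merely rescaled in amplitude.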