Memristor-based neural networks with weight simultaneous perturbation training

2019 
The training of neural networks involves numerous operations on the weight matrix. If a neural network is implemented in hardware, all weights can be updated in parallel. However, neural networks based on CMOS technology face many challenges in the weight-update phase. For example, although the back-propagation algorithm is efficient and popular, computing the derivative of the activation function and propagating errors backward are difficult to realize at the circuit level. In this paper, a novel synaptic unit based on two identical memristors is designed, and on this basis a new neural network circuit architecture is proposed. The whole network is trained with a hardware-friendly weight simultaneous perturbation (WSP) algorithm. A hardware implementation of neural networks based on the WSP algorithm involves only the feedforward circuit and does not require a bidirectional circuit. Furthermore, only two forward calculations are needed to update all weight matrices for each pattern, which significantly simplifies the weight-update circuit and makes the neural network simpler and easier to implement in hardware. The practicability, utility, and simplicity of this scheme are demonstrated on supervised learning tasks.
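As a rough illustration of the update rule described above, the following is a minimal sketch of the generic weight-simultaneous-perturbation scheme (not the paper's memristor circuit) for a toy single-layer network. All weights are perturbed at once by random signs, and the loss is measured twice (the two forward calculations per pattern); the learning rate `lr`, perturbation size `gamma`, and the sigmoid layer are illustrative assumptions.

```python
import numpy as np

def wsp_step(W, x, target, lr=0.05, gamma=1e-3):
    """One WSP update: two forward passes estimate the gradient of the
    loss with respect to *all* weights at once, with no backward pass.
    Hypothetical single-layer sigmoid network, for illustration only."""
    def loss(weights):
        y = 1.0 / (1.0 + np.exp(-(weights @ x)))  # forward calculation
        return 0.5 * np.sum((y - target) ** 2)    # squared error

    # Perturb every weight simultaneously by +/- gamma (random signs).
    delta = np.random.choice([-1.0, 1.0], size=W.shape)
    j0 = loss(W)                  # first forward calculation
    j1 = loss(W + gamma * delta)  # second forward calculation
    # Finite-difference estimate of the loss change along the
    # perturbation direction, applied as a gradient step.
    return W - lr * (j1 - j0) / gamma * delta

# Toy usage: learn a single binary pattern.
rng = np.random.default_rng(0)
W = rng.normal(scale=0.1, size=(2, 3))
x, target = np.array([1.0, 0.5, -0.2]), np.array([1.0, 0.0])
for _ in range(500):
    W = wsp_step(W, x, target)
```

In a memristive realization, the two loss measurements would come from the analog feedforward circuit itself, and the resulting update would be applied as programming pulses to the double-memristor synapses; the NumPy version above only mirrors the arithmetic of the rule.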