LATIM: Loading-Aware Offline Training Method for Inverter-based Memristive Neural Networks

2021 
In this brief, we present a high-accuracy training method for inverter-based memristive neural networks (IM-NNs). The method, which relies on accurate modeling of the circuit element characteristics, is called LATIM (Loading-Aware offline Training method for Inverter-based Memristive NNs). In LATIM, an approximation method is proposed to estimate the effective load of the memristive crossbar (acting as the synapses), while two NNs are utilized to predict the voltage transfer characteristic (VTC) of the inverters (acting as the activation functions). The efficacy of the proposed method is compared with recent offline training methods for IM-NNs, called PHAX and RIM. Simulation results reveal that LATIM predicts the output voltage of IM-NNs with, on average, $14\times$ ($6\times$) and $29\times$ ($4\times$) smaller error for the MNIST and Fashion-MNIST datasets, respectively, compared to the PHAX (RIM) method. In addition, IM-NNs trained by LATIM consume, on average, 62% and 53% lower energy than those trained by PHAX and RIM, respectively, owing to proper sizing of the inverters.
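To make the loading effect mentioned above concrete, the following sketch (a hypothetical illustration, not the paper's actual code) shows why a loading-aware estimate of a crossbar column's output voltage differs from the ideal one. Each input voltage drives the column node through a memristor conductance; by nodal analysis, the node voltage is the conductance-weighted average of the inputs, and an effective load conductance at the node (e.g., the inverter's input loading) shifts that voltage. The function name `column_voltage` and all numeric values are assumptions chosen for illustration.

```python
# Hypothetical sketch of a loading-aware crossbar column voltage estimate.
# Nodal analysis of one column node:
#   V_node = sum(G_i * V_i) / (sum(G_i) + G_load)
# where G_i are memristor conductances, V_i the input voltages, and
# G_load an effective load conductance seen at the column node.

def column_voltage(voltages, conductances, g_load=0.0):
    """Column node voltage; g_load=0 gives the ideal (unloaded) estimate."""
    num = sum(g * v for g, v in zip(conductances, voltages))
    den = sum(conductances) + g_load
    return num / den

# Illustrative (assumed) values:
V = [1.0, -0.5, 0.8]      # input voltages (V)
G = [1e-4, 2e-4, 5e-5]    # memristor conductances (S)

ideal = column_voltage(V, G)           # loading ignored
loaded = column_voltage(V, G, 1e-4)    # effective load included

print(f"ideal={ideal:.4f} V, loaded={loaded:.4f} V")  # the two estimates differ
```

Training offline with the ideal (unloaded) voltage and then deploying on the real circuit introduces exactly this kind of mismatch; estimating the effective load during training is what the "loading-aware" part of LATIM addresses.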