Dropout and DropConnect for Reliable Neuromorphic Inference under Communication Constraints in Network Connectivity

2019 
Dropout and DropConnect are known as effective methods to improve the generalization performance of neural networks, by randomly dropping either the states of neural units or the weights of synaptic connections at each time instance throughout the training process. In this paper, we extend the use of these methods to the design of neuromorphic spiking neural network (SNN) hardware to improve the reliability of inference under resource-constrained errors in network connectivity. Such energy and bandwidth constraints arise from low-power operation of the communication between neural units, and they cause dropped spike events due to timeout errors in transmission. The Dropout and DropConnect processes during training of the network are aligned with a statistical model of the network during inference that accounts for these random errors in the transmission of neural states and synaptic connections. Using Dropout and DropConnect during training therefore meets two design objectives simultaneously: it improves the robustness of inference to spike events dropped under timeout communication constraints in network connectivity, while maximizing time-to-decision bandwidth and hence minimizing inference energy in the neuromorphic hardware. Simulations with a 5-layer fully connected 784-500-500-500-10 SNN on the MNIST task show 3.42-fold and 7.06-fold decreases in inference energy at 90% test accuracy when using Dropout and DropConnect, respectively, during backpropagation training. Simulations with a convolutional neural network on the CIFAR-10 task likewise show a 1.24-fold decrease in inference energy at 60% test accuracy when using Dropout during backpropagation training.
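
To make the distinction between the two training-time mechanisms concrete, the following is a minimal sketch in a rate-based (non-spiking) setting, assuming PyTorch is available. It is not the authors' implementation: the class names, the layer sizes taken from the 784-500-500-500-10 network quoted above, and the drop probability `p` are illustrative. Dropout zeroes unit activations (analogous to dropped spike events from a neuron), while DropConnect zeroes individual weight entries (analogous to dropped transmissions on individual synaptic connections); both resample a fresh Bernoulli mask at every training step.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class DropConnectLinear(nn.Module):
    """Linear layer with DropConnect: individual synaptic weights are
    randomly dropped at each training step (hypothetical helper, not
    from the paper)."""
    def __init__(self, in_features, out_features, p=0.5):
        super().__init__()
        self.linear = nn.Linear(in_features, out_features)
        self.p = p  # probability of dropping each weight

    def forward(self, x):
        if self.training:
            # Fresh Bernoulli mask over the weight matrix each forward pass,
            # mirroring per-transmission drops of synaptic connections.
            keep = torch.bernoulli(torch.full_like(self.linear.weight, 1 - self.p))
            # Inverted scaling keeps the expected pre-activation unchanged.
            w = self.linear.weight * keep / (1 - self.p)
            return F.linear(x, w, self.linear.bias)
        return self.linear(x)

class DropoutMLP(nn.Module):
    """784-500-500-500-10 fully connected network with unit-level Dropout,
    matching the layer sizes quoted in the abstract."""
    def __init__(self, p=0.5):
        super().__init__()
        self.fc1 = nn.Linear(784, 500)
        self.fc2 = nn.Linear(500, 500)
        self.fc3 = nn.Linear(500, 500)
        self.fc4 = nn.Linear(500, 10)
        self.p = p

    def forward(self, x):
        # Each F.dropout call zeroes unit states, analogous to spike
        # events dropped in transmission between layers.
        x = F.dropout(F.relu(self.fc1(x)), self.p, self.training)
        x = F.dropout(F.relu(self.fc2(x)), self.p, self.training)
        x = F.dropout(F.relu(self.fc3(x)), self.p, self.training)
        return self.fc4(x)
```

Note that the inverted scaling by `1 / (1 - p)` shown here is the standard "inverted dropout" convention; in the paper's setting, the training-time drop probability would instead be chosen to match the statistical model of timeout-induced spike drops expected at inference.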