ComPreEND: Computation Pruning through Predictive Early Negative Detection for ReLU in a Deep Neural Network Accelerator

2021 
A vast number of the activation values in DNNs are zeros due to ReLU (Rectified Linear Unit), one of the most common activation functions used in modern neural networks. Since ReLU outputs zero for all negative inputs, the inputs to ReLU do not need to be computed exactly as long as they are known to be negative. However, most accelerators do not consider this aspect of DNNs, forgoing substantial opportunities for speedups and energy savings. To exploit these opportunities, we propose early negative detection (END), a computation pruning technique that detects negative results at an early stage. The key to early negative detection is the adoption of an inverted two's complement representation for the filter parameters. This ensures that as soon as an intermediate result becomes negative, the final result is guaranteed to be negative. Upon detection, the remaining computation can be skipped and the subsequent ReLU output can simply be set to zero. We also propose a DNN accelerator architecture (ComPreEND) that takes advantage of this skipping. Our evaluation shows that ComPreEND with END significantly improves both energy efficiency and performance.
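The sketch below illustrates the core idea of early negative detection in software, assuming an MSB-first, bit-serial accumulation of the weights: because post-ReLU activations are non-negative, the maximum contribution the unprocessed low-order bit-planes can still add is bounded, so once the partial sum can no longer reach zero, the final result is provably negative and the ReLU output can be set to zero immediately. The paper achieves the same monotonicity guarantee in hardware via the inverted two's complement encoding of the filter parameters; the function name and bit-serial software formulation here are illustrative, not taken from the paper.

```python
def relu_dot_product_with_end(activations, weights, bits=8):
    """Compute ReLU(dot(activations, weights)) with early negative detection.

    `activations` are assumed non-negative (post-ReLU outputs of the previous
    layer); `weights` are signed integers representable in `bits` bits in
    two's complement. Weight bit-planes are processed sign bit first; as soon
    as the partial sum plus the largest possible remaining contribution is
    still negative, the remaining work is skipped and 0 is returned.
    """
    assert all(a >= 0 for a in activations)
    total_activation = sum(activations)

    # Sign bit-plane first: each set sign bit contributes -2^(bits-1).
    partial = sum(-(1 << (bits - 1)) * ((w >> (bits - 1)) & 1) * a
                  for w, a in zip(weights, activations))

    for plane in range(bits - 2, -1, -1):
        # Upper bound on what the remaining bit-planes (this one and below)
        # can still add: every remaining bit set, against all activations.
        max_remaining = ((1 << (plane + 1)) - 1) * total_activation
        if partial + max_remaining < 0:
            return 0  # final sum is provably negative -> ReLU output is 0

        # Accumulate the contribution of the current bit-plane.
        partial += sum(((w >> plane) & 1) * (1 << plane) * a
                       for w, a in zip(weights, activations))

    return max(partial, 0)


# Example: weights [3, -7, 2], activations [1, 5, 0] -> 3 - 35 = -32 -> ReLU = 0,
# detected well before all bit-planes are processed.
print(relu_dot_product_with_end([1, 5, 0], [3, -7, 2]))
```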