Error Tolerance Analysis of Deep Learning Hardware Using a Restricted Boltzmann Machine Toward Low-Power Memory Implementation

2017 
The remarkable hardware robustness of deep learning (DL) is revealed by error-injection analyses performed using a custom hardware model implementing parallelized restricted Boltzmann machines (RBMs). RBMs in deep belief networks demonstrate robustness against memory errors both during and after learning. Fine-tuning plays a significant role in recovering accuracy after static errors are injected into the structural data of the RBMs. This memory-error tolerance is observable in our hardware networks with fine-grained memory distribution, enabling reliable DL hardware with low-voltage-driven memory suitable for low-power applications.
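As a rough illustration of the kind of experiment the abstract describes, the sketch below trains a toy RBM, injects static bit-flip errors into a fixed-point representation of its weight memory, and then applies a few fine-tuning steps to show recovery. The network size, CD-1 training, 16-bit fixed-point format, bit-error rate, and all function names are illustrative assumptions, not the authors' hardware model.

```python
# Minimal sketch (assumed setup, not the authors' hardware model):
# train a tiny RBM with CD-1, inject static bit-flip errors into its
# fixed-point weight memory, and fine-tune to recover accuracy.
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def cd1_step(W, v0, lr=0.1):
    """One contrastive-divergence (CD-1) update on batch v0."""
    h0 = sigmoid(v0 @ W)
    v1 = sigmoid(h0 @ W.T)
    h1 = sigmoid(v1 @ W)
    return W + lr * (v0.T @ h0 - v1.T @ h1) / len(v0)

def recon_error(W, v):
    """Mean squared reconstruction error of the visible units."""
    return np.mean((v - sigmoid(sigmoid(v @ W) @ W.T)) ** 2)

def inject_bit_flips(W, ber, frac_bits=12):
    """Quantize weights to signed 16-bit fixed point, flip each stored
    bit independently with probability `ber`, then dequantize.
    The 16-bit word length and Q-format are illustrative assumptions."""
    q = np.clip(np.round(W * (1 << frac_bits)), -32768, 32767).astype(np.int16)
    flips = rng.random(q.shape + (16,)) < ber          # which bits flip
    masks = (flips * (1 << np.arange(16))).sum(-1).astype(np.uint16)
    q = (q.view(np.uint16) ^ masks).view(np.int16)     # apply bit flips
    return q.astype(float) / (1 << frac_bits)

# Toy binary data and a 64x32 RBM (sizes are arbitrary).
data = (rng.random((500, 64)) < 0.5).astype(float)
W = 0.01 * rng.standard_normal((64, 32))
for _ in range(50):
    W = cd1_step(W, data)
print("clean      :", recon_error(W, data))

# Static memory errors, then a short fine-tuning run: the recovery
# effect the abstract attributes to fine-tuning.
Wc = inject_bit_flips(W, ber=1e-2)
print("corrupted  :", recon_error(Wc, data))
for _ in range(20):
    Wc = cd1_step(Wc, data)
print("fine-tuned :", recon_error(Wc, data))
```

Under these assumptions, sweeping `ber` maps out a memory-error tolerance curve; the paper's actual analysis uses its custom parallelized-RBM hardware model rather than this NumPy stand-in.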