Area-Efficient Transposable 6T SRAM for Fast Online Learning in Neuromorphic Processors

2019 
This paper presents a 6T SRAM-based transposable synapse memory that improves the online learning performance of neuromorphic processors at minimal area cost. Whereas the previous transposable synapse memory relied on a custom 8T SRAM cell, the proposed design uses standard 6T SRAM, leading to substantial area savings. A hierarchical word line structure with a row transition multiplexer enables both row-wise and column-wise accesses within a single integrated SRAM array. A 64K-synapse memory employing the proposed scheme is implemented in a 28nm CMOS technology; it incurs a 17.7% area overhead compared to a non-transposable 6T synapse memory, while achieving 50.0% area savings over the transposable 8T synapse memory and 35.5% area savings over the previous transposable 6T synapse memory. The estimated performance gain for online spike-timing-dependent plasticity (STDP) learning on the MNIST dataset is $6.7 \times$ compared to the non-transposable synapse memory.
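The following is a minimal behavioral sketch, not the paper's circuit, illustrating why transposable access matters for online STDP: forward spike propagation reads a full row of outgoing weights per pre-synaptic spike, while the weight update on a post-synaptic spike needs a full column of incoming weights. The array size, class and method names, and the simplified pair-based update rule are illustrative assumptions, not taken from the paper.

```python
import numpy as np

class TransposableSynapseArray:
    """Weight array with both row-wise and column-wise word access (behavioral model)."""

    def __init__(self, n_pre=256, n_post=256):
        # Signed 8-bit synaptic weights; dimensions are assumed for illustration.
        self.w = np.zeros((n_pre, n_post), dtype=np.int8)

    def read_row(self, pre_idx):
        # Row-wise access: all outgoing weights of one pre-synaptic neuron in a
        # single word-line activation (used during forward spike propagation).
        return self.w[pre_idx, :]

    def read_col(self, post_idx):
        # Column-wise access: all incoming weights of one post-synaptic neuron in
        # a single activation. A non-transposable memory would instead need n_pre
        # separate row accesses to gather the same data.
        return self.w[:, post_idx]

    def stdp_update(self, post_idx, pre_spike_trace, lr=1):
        # On a post-synaptic spike, potentiate synapses whose pre-synaptic neurons
        # fired recently (simplified pair-based STDP, hypothetical update rule).
        col = self.read_col(post_idx).astype(np.int16) + lr * (pre_spike_trace > 0)
        self.w[:, post_idx] = np.clip(col, -128, 127).astype(np.int8)
```

In a non-transposable memory, the column read in `stdp_update` would dominate learning time; providing single-cycle column access is the source of the speedup the abstract reports.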