Why design reliable chips when faulty ones are even better

2013 
Moore's law, the driving force behind the computing technology revolution, is widely expected to face limiting, if not disruptive, hurdles within the next five years or so, owing in part to its inability to cope with the errors arising from device variations and perturbations, as well as the accompanying increase in power density, in the deep-nanoscale CMOS regime. To overcome these twin hurdles, and in sharp contrast to conventional research following von Neumann's legacy of designing reliable hardware from unreliable components, we adopt a radically different philosophy: designing unreliable hardware from reliable or unreliable components, wherein we take advantage of inherent or induced errors in the circuits to achieve significant cost savings (in terms of size, energy, design, performance, manufacturing, and verification). Our approach is not only sensitive to the value of information, thereby producing good-enough designs at lower cost, but also opens an entirely new design space in which perceptual and statistical limitations (and hence the desired quality) become a dimension that can be traded off.
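To make the accuracy-for-cost tradeoff concrete, here is a minimal illustrative sketch (not from the paper itself) of one well-known inexact-circuit idea in this spirit: a "lower-part OR" style approximate adder, where the k low-order bits are OR-ed instead of added so that no carry chain is needed for them. The function names, bit widths, and the choice of k below are assumptions made for illustration only.

```python
# Illustrative sketch, assuming a lower-part-OR style approximate adder
# (a common inexact-design technique; not necessarily the authors' circuit).
import random

def approx_add(a: int, b: int, width: int = 16, k: int = 4) -> int:
    """Add two unsigned `width`-bit ints, approximating the k low bits."""
    low_mask = (1 << k) - 1
    # Exact addition on the high part only; the carry out of the low
    # part is dropped, which is where the (bounded) error comes from.
    high = ((a >> k) + (b >> k)) << k
    # Bitwise OR replaces addition on the low bits: cheap and carry-free.
    low = (a & low_mask) | (b & low_mask)
    return (high + low) & ((1 << width) - 1)

# Quick quality check: mean relative error over random operands,
# standing in for the "desired quality" dimension being traded off.
random.seed(0)
errs = []
for _ in range(10_000):
    a, b = random.getrandbits(16), random.getrandbits(16)
    exact = (a + b) & 0xFFFF
    if exact:
        errs.append(abs(approx_add(a, b) - exact) / exact)
print(f"mean relative error: {sum(errs) / len(errs):.4%}")
```

Raising k shortens the carry chain further (saving area, delay, and energy in a hardware realization) at the price of a larger expected error, which is exactly the quality-versus-cost dimension the abstract describes.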