
Stochastic computing

Stochastic computing is a collection of techniques that represent continuous values by streams of random bits. Complex computations can then be carried out by simple bit-wise operations on the streams. Stochastic computing is distinct from the study of randomized algorithms.

Suppose that p, q ∈ [0, 1] are given, and we wish to compute p × q. Stochastic computing performs this operation using probability instead of arithmetic. Specifically, suppose that there are two random, independent bit streams called stochastic numbers (i.e. Bernoulli processes), where the probability of a one in the first stream is p and the probability of a one in the second stream is q. Taking the logical AND of the two streams produces an output stream in which the probability of a one is pq. By observing enough output bits and measuring the frequency of ones, it is possible to estimate pq to arbitrary accuracy.

The operation above converts a fairly complicated computation (multiplication of p and q) into a series of very simple operations (evaluation of a_i ∧ b_i) on random bits. More generally speaking, stochastic computing represents numbers as streams of random bits and reconstructs numbers by calculating frequencies. The computations are performed on the streams and translate complicated operations on p and q into simple operations on their stream representations. (Because of the method of reconstruction, devices that perform these operations are sometimes called stochastic averaging processors.) In modern terms, stochastic computing can be viewed as an interpretation of calculations in probabilistic terms, which are then evaluated with a Gibbs sampler. It can also be interpreted as a hybrid analog/digital computer.

Stochastic computing was first introduced in a pioneering paper by John von Neumann in 1953. However, the theory could not be fully developed until advances in computing of the 1960s, mostly through a series of simultaneous and parallel efforts in the US and the UK. By the late 1960s, attention turned to the design of special-purpose hardware to perform stochastic computation. A host of these machines were constructed between 1969 and 1974; RASCEL is pictured in this article.

Despite the intense interest in the 1960s and 1970s, stochastic computing ultimately failed to compete with more traditional digital logic, for reasons outlined below. The first (and last) International Symposium on Stochastic Computing took place in 1978; active research in the area dwindled over the next few years.

Although stochastic computing declined as a general method of computing, it has shown promise in several applications. Research has traditionally focused on certain tasks in machine learning and control. Somewhat recently, interest has turned towards stochastic decoding, which applies stochastic computing to the decoding of error-correcting codes. More recently, stochastic circuits have been successfully used in image processing tasks such as edge detection and image thresholding.
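The multiplication scheme described above can be illustrated with a short simulation: encode each value as a Bernoulli bit stream, AND the streams bit-wise, and recover the product from the frequency of ones in the result. The following is a minimal Python sketch assuming the unipolar encoding (a value in [0, 1] is the probability of a one); the helper names to_stream, from_stream, and stochastic_multiply are illustrative, not part of any standard stochastic-computing library.

import random

def to_stream(value, length, rng=random):
    """Encode a value in [0, 1] as a random bit stream (a Bernoulli process)."""
    return [1 if rng.random() < value else 0 for _ in range(length)]

def from_stream(bits):
    """Reconstruct the represented value as the frequency of ones in the stream."""
    return sum(bits) / len(bits)

def stochastic_multiply(p, q, length=10_000):
    """Estimate p * q by AND-ing two independent stochastic streams."""
    a = to_stream(p, length)
    b = to_stream(q, length)
    product_stream = [ai & bi for ai, bi in zip(a, b)]  # bit-wise AND
    return from_stream(product_stream)

if __name__ == "__main__":
    p, q = 0.6, 0.3
    estimate = stochastic_multiply(p, q)
    print(f"exact: {p * q:.4f}, stochastic estimate: {estimate:.4f}")

Running the sketch with 10,000 bits typically gives an estimate within about 0.01 of the exact product; the accuracy improves with longer streams, which is the trade-off between stream length and precision that characterizes stochastic computing.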

[ "Computation", "Binary number", "Artificial neural network", "Logic gate", "Electronic circuit" ]
Parent Topic
Child Topic
    No Parent Topic