An Evolutionary Neuron Model with Dendritic Computation for Classification and Prediction.

2021 
Advances in the understanding of dendrites have promoted the development of dendritic computation. For decades, researchers have been committed to proposing appropriate neural models, which may in turn inform research on biological neurons. This paper aims to employ an effective metaheuristic optimization algorithm as the learning algorithm to train the dendritic neuron model (DNM). The powerful ability of the backpropagation (BP) algorithm to train artificial neural networks led to its adoption as the learning algorithm for the conventional DNM, but this inevitably causes the DNM to inherit the drawbacks of BP. Therefore, a metaheuristic optimization algorithm, the firefly algorithm (FA), is adopted to train the DNM (FADNM). Experiments on twelve datasets involving classification and prediction are performed to evaluate its performance. The experimental results and the corresponding statistical analysis show that the learning algorithm plays a decisive role in the performance of the DNM. It is worth emphasizing that the FADNM incorporates a valuable neural pruning scheme that eliminates superfluous synapses and dendrites, simplifying its structure and forming a unique morphology. This simplified morphology can be implemented in hardware using logic circuits, with approximately no effect on the accuracy of the original model. The hardware implementation enables the FADNM to efficiently process high-speed data streams for large-scale data, which leads us to believe that it may be a promising technology for dealing with big data.
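To make the approach described above concrete, the sketch below illustrates the standard DNM forward pass (sigmoid synapses, multiplicative dendrites, a summing membrane, and a sigmoid soma) together with a minimal firefly-algorithm training loop. This is not the authors' implementation; all hyperparameter values (k, beta0, gamma, alpha, population size) and function names are illustrative assumptions, and the paper's pruning scheme and hardware mapping are not included.

```python
import numpy as np

def dnm_forward(x, w, q, k=5.0, k_soma=5.0, theta=0.5):
    """Assumed DNM forward pass: x has shape (n_features,), w and q have
    shape (n_dendrites, n_features); returns a scalar output in (0, 1)."""
    synapse = 1.0 / (1.0 + np.exp(-k * (w * x - q)))            # synaptic layer
    dendrite = np.prod(synapse, axis=1)                         # dendritic (product) layer
    membrane = np.sum(dendrite)                                 # membrane (sum) layer
    return 1.0 / (1.0 + np.exp(-k_soma * (membrane - theta)))   # soma output

def firefly_train(X, y, n_dendrites=3, n_fireflies=20, iters=100,
                  beta0=1.0, gamma=1.0, alpha=0.2, seed=0):
    """Minimal firefly algorithm sketch: each firefly encodes the flattened
    (w, q) parameters and moves toward brighter (lower-loss) fireflies."""
    rng = np.random.default_rng(seed)
    n_features = X.shape[1]
    dim = 2 * n_dendrites * n_features
    pop = rng.uniform(-1.0, 1.0, (n_fireflies, dim))

    def loss(vec):
        w, q = vec.reshape(2, n_dendrites, n_features)
        preds = np.array([dnm_forward(x, w, q) for x in X])
        return np.mean((preds - y) ** 2)                        # mean squared error

    brightness = np.array([loss(p) for p in pop])               # lower loss = brighter
    for _ in range(iters):
        for i in range(n_fireflies):
            for j in range(n_fireflies):
                if brightness[j] < brightness[i]:               # firefly i moves toward brighter j
                    r2 = np.sum((pop[i] - pop[j]) ** 2)
                    beta = beta0 * np.exp(-gamma * r2)          # attractiveness decays with distance
                    pop[i] += beta * (pop[j] - pop[i]) + alpha * (rng.random(dim) - 0.5)
                    brightness[i] = loss(pop[i])
        alpha *= 0.97                                           # gradually reduce random step size
    best = pop[np.argmin(brightness)]
    return best.reshape(2, n_dendrites, n_features)             # learned (w, q)
```

As a usage note, `w, q = firefly_train(X, y)` on a small binary-labelled dataset yields parameters that can be passed to `dnm_forward` and thresholded at 0.5 for classification; a gradient-free optimizer like the FA avoids BP's sensitivity to local minima at the cost of many more forward evaluations.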