Multi layer multi objective extreme learning machine

2017 
Fully connected multi-layer neural networks such as Deep Boltzmann Machines (DBM) perform better than fully connected single-layer neural networks in image classification tasks and require fewer hidden-layer neurons than Extreme Learning Machine (ELM) based fully connected multi-layer neural networks such as Multi-Layer ELM (ML-ELM) and Hierarchical ELM (H-ELM). However, ML-ELM and H-ELM have shorter training times than DBM. This paper introduces a fully connected multi-layer neural network, referred to as the Multi-Layer Multi-Objective Extreme Learning Machine (MLMO-ELM), which uses a multi-objective formulation to propagate label and non-linear information in order to learn a network model with a similar number of hidden-layer parameters to DBM and a shorter training time than DBM. The experimental results show that MLMO-ELM outperforms DBM, ML-ELM and H-ELM on the OCR and NORB datasets.
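For context on the building block the paper extends, here is a minimal sketch of a basic single-hidden-layer ELM (not the paper's MLMO-ELM): input weights and biases are drawn at random and never trained, and only the output weights are computed in closed form via a least-squares solve. The function and variable names are illustrative, not from the paper.

```python
import numpy as np

def elm_fit(X, Y, n_hidden=64, seed=0):
    """Train a basic single-hidden-layer ELM.

    Random, untrained input weights; output weights solved by
    least squares using the Moore-Penrose pseudoinverse.
    """
    rng = np.random.default_rng(seed)
    W = rng.normal(size=(X.shape[1], n_hidden))  # random input weights (fixed)
    b = rng.normal(size=n_hidden)                # random hidden biases (fixed)
    H = np.tanh(X @ W + b)                       # hidden-layer activation matrix
    beta = np.linalg.pinv(H) @ Y                 # closed-form output weights
    return W, b, beta

def elm_predict(X, W, b, beta):
    """Forward pass: hidden activations times learned output weights."""
    return np.tanh(X @ W + b) @ beta

# Usage: fit a toy smooth regression target.
rng = np.random.default_rng(1)
X = rng.uniform(-1, 1, size=(200, 2))
Y = np.sin(X[:, :1]) + X[:, 1:] ** 2
W, b, beta = elm_fit(X, Y, n_hidden=100)
pred = elm_predict(X, W, b, beta)
```

Because no gradient descent is involved, training reduces to one matrix solve, which is the source of the short training times the abstract attributes to ELM-based networks.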