Improving Classification Performance of Deep Learning Models Using Bio-Inspired Computing

2021 
Deep learning models have paved the way toward high-accuracy classification systems for many applications, including lung disease classification, electrocardiogram classification, electroencephalogram classification, and forest cover classification. All of these applications rely on the efficient feature selection capabilities of deep learning models such as convolutional neural networks (CNNs), recurrent neural networks (RNNs), and long short-term memory (LSTM) networks. These models evaluate a very large number of feature combinations via iterative window-based feature processing, attempting to cover an indefinite number of feature combinations in order to classify a definite number of features into a definite number of classes. All of these models use a stopping criterion based on the error-rate difference between the previous and current iterations: if the error-rate change is below a given threshold and the number of iterations exceeds a predefined value, training is stopped. This property limits their real-time performance, because training stops even when accuracy is lower than expected. The cause of this low accuracy is the high dimensionality of the search space, due to which selection of the most optimal features may be skipped. To reduce the probability of such conditions, this text proposes a bio-inspired genetic algorithm (GA) model for accuracy-based feature selection. The selected features are given to deep learning models such as LSTM and RNN, and their internal performance is evaluated. A heart failure disease dataset from Kaggle is used, and it is observed that, due to the pre-training feature selection process, the overall accuracy of these models improves by 10%, while precision, recall, and F-measure scores improve by 15% for the heart disease datasets. Specificity and sensitivity improve by 20% when compared with the RNN and LSTM models individually.
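The pre-selection step described above can be sketched as a simple genetic algorithm over binary feature masks. The sketch below is a minimal illustration under stated assumptions, not the authors' implementation: the `fitness` function is a hypothetical stand-in for the validation accuracy of an LSTM/RNN trained on the masked features, and all parameters (feature count, population size, mutation rate) are assumptions.

```python
import random

random.seed(0)

N_FEATURES = 12        # hypothetical number of candidate features
POP_SIZE = 20          # assumed population size
GENERATIONS = 30
MUTATION_RATE = 0.05

def fitness(mask):
    # Hypothetical stand-in for model accuracy on the selected subset:
    # pretend features 0-5 are informative, penalise noise and subset size.
    informative = sum(mask[:6])
    noise = sum(mask[6:])
    return informative - 0.5 * noise - 0.1 * sum(mask)

def random_mask():
    # A chromosome is a binary mask: 1 = feature selected, 0 = dropped.
    return [random.randint(0, 1) for _ in range(N_FEATURES)]

def crossover(a, b):
    # Single-point crossover of two parent masks.
    point = random.randrange(1, N_FEATURES)
    return a[:point] + b[point:]

def mutate(mask):
    # Flip each bit independently with a small probability.
    return [bit ^ 1 if random.random() < MUTATION_RATE else bit
            for bit in mask]

population = [random_mask() for _ in range(POP_SIZE)]
for _ in range(GENERATIONS):
    # Truncation selection: keep the fitter half as parents.
    population.sort(key=fitness, reverse=True)
    parents = population[:POP_SIZE // 2]
    children = [mutate(crossover(random.choice(parents),
                                 random.choice(parents)))
                for _ in range(POP_SIZE - len(parents))]
    population = parents + children

best = max(population, key=fitness)
print(best)  # best feature mask found; feed only these features to the model
```

The resulting mask would then be used to filter the dataset columns before training the downstream LSTM or RNN classifier.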