A Multi-Strategy Whale Optimization Algorithm and Its Application

2022 
Abstract The Whale Optimization Algorithm (WOA) is a key tool for solving complex engineering optimization problems, which aim to adjust important parameters so as to satisfy constraints and optimize objectives. WOA has a simple structure, few parameters, high search capability, and easy implementation. However, like other metaheuristic algorithms, it is prone to local optima and slow convergence. To address these issues, the Multi-Strategy Whale Optimization Algorithm (MSWOA) is proposed. MSWOA introduces four strategies. First, a highly randomized chaotic logistic map is used to generate a high-quality initial population. Second, exploitation and exploration are enhanced by adaptive weights and a dynamic convergence factor. Third, a Levy flight mechanism is introduced to maintain population diversity in each iteration. Finally, an Evolutionary Population Dynamics (EPD) mechanism is introduced to improve the efficiency of search agents in finding the optimum. A further problem is that the Semi-Supervised Extreme Learning Machine (SSELM) based on manifold regularization, although an effective classification and regression model, suffers from the random generation of input weights and hidden-layer thresholds and from grid-based hyperparameter selection, which lead to unsatisfactory classification performance. To this end, we developed the MSWOA-SSELM model, in which MSWOA optimally selects the parameters of SSELM, and applied it to logging layer recognition, effectively improving the accuracy of logging interpretation. In experiments comparing MSWOA with 14 swarm intelligence algorithms on 18 benchmark test functions, the CEC2017 benchmark suite, and an engineering application problem, the results show that MSWOA is significantly superior and effective in solving global optimization problems. Finally, the proposed MSWOA-SSELM is applied to three wells and outperforms other classification models in terms of Accuracy (ACC), Root Mean Square Error (RMSE), and Mean Absolute Error (MAE), achieving the best results with an ACC of 96.2567%, an MAE of 0.0749, and an RMSE of 0.3870.
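For concreteness, the sketch below illustrates two of the four strategies named above using standard formulations: chaotic logistic-map initialization of the population and a Levy-flight step generated by the Mantegna method. It is a minimal sketch, not the paper's implementation; the function names, parameter values (mu = 4, beta = 1.5), and the Mantegna construction are assumptions made for illustration.

```python
import numpy as np
from math import gamma

def logistic_map_init(pop_size, dim, lb, ub, mu=4.0, seed=0.7):
    """Chaotic logistic-map initialization (illustrative; mu and seed are assumptions)."""
    chaos = np.empty((pop_size, dim))
    x = seed
    for i in range(pop_size):
        for j in range(dim):
            x = mu * x * (1.0 - x)      # logistic map: x_{k+1} = mu * x_k * (1 - x_k)
            chaos[i, j] = x
    return lb + chaos * (ub - lb)       # scale chaotic values into the search bounds

def levy_step(dim, beta=1.5):
    """Levy-flight step via the Mantegna method (a common construction, assumed here)."""
    sigma = (gamma(1 + beta) * np.sin(np.pi * beta / 2) /
             (gamma((1 + beta) / 2) * beta * 2 ** ((beta - 1) / 2))) ** (1 / beta)
    u = np.random.normal(0.0, sigma, dim)
    v = np.random.normal(0.0, 1.0, dim)
    return u / np.abs(v) ** (1 / beta)

# Example: initialize 30 whales in 10 dimensions within [-100, 100],
# then draw a Levy perturbation that could be added to a whale's position.
pop = logistic_map_init(pop_size=30, dim=10, lb=-100.0, ub=100.0)
step = levy_step(dim=10)
```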