Enhancing evolutionary multitasking optimization by leveraging inter-task knowledge transfers and improved evolutionary operators

2023 
It is inefficient and time-consuming to begin the search from scratch for each optimization task. Evolutionary multitasking optimization (EMTO) handles multiple tasks simultaneously, aiming to improve the solution quality of every task via an evolutionary algorithm (EA) and inter-task knowledge transfer. Thus, suitable evolutionary operators and effective inter-task knowledge transfer are two key factors for the success of EMTO. As one of the representative EMTO algorithms, the multifactorial evolutionary algorithm (MFEA) has attracted much attention. However, MFEA suffers from premature convergence and from negative knowledge transfer among weakly correlated tasks. To address these issues, this article enhances MFEA with two proposed strategies, a carefully designed opposition-based learning (OBL) strategy and a carefully designed differential evolution (DE) strategy; the resulting algorithm is named MFDE-OBL for short. Both the proposed OBL and DE strategies contain an inter-task component and an intra-task component. To improve the effectiveness of knowledge transfer, the inter-task OBL strategy learns a linear subspace mapping between tasks' subpopulations so that knowledge at different search scales can be transferred across tasks, while the inter-task DE strategy uses genetic information from another task to improve population diversity with different scales and directions. In addition, the intra-task generalized-opposite-point-based OBL enhances global search ability, while the intra-task DE strategy combines two complementary DE variants to maintain a good balance between exploitation and exploration. Finally, the proposed algorithm is tested on both single-objective and multi-objective multitasking test suites. Experimental results show the efficiency and effectiveness of the proposed algorithm compared with both classical and state-of-the-art algorithms.
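As background for the two building blocks named in the abstract, the sketch below illustrates generic generalized opposition-based learning (reflecting solutions through a randomly scaled opposite point within the population's bounding box) and the classic DE/rand/1 mutation. This is a minimal illustration of the standard operators, not the paper's MFDE-OBL strategies; the function names and parameters are our own.

```python
import numpy as np

rng = np.random.default_rng(0)

def generalized_opposite(pop, k=None, lo=0.0, hi=1.0):
    """Generalized OBL sketch: reflect each solution x through a randomly
    scaled opposite point, x' = k * (a + b) - x, where a and b are the
    per-dimension min/max of the current population."""
    if k is None:
        k = rng.random()  # random scaling factor in [0, 1)
    a, b = pop.min(axis=0), pop.max(axis=0)
    opp = k * (a + b) - pop
    return np.clip(opp, lo, hi)  # keep candidates inside the box bounds

def de_rand_1(pop, f=0.5):
    """Classic DE/rand/1 mutation: v = x_r1 + F * (x_r2 - x_r3),
    with r1, r2, r3 distinct random indices per target vector."""
    n = len(pop)
    idx = np.array([rng.choice(n, size=3, replace=False) for _ in range(n)])
    r1, r2, r3 = pop[idx[:, 0]], pop[idx[:, 1]], pop[idx[:, 2]]
    return r1 + f * (r2 - r3)

pop = rng.random((10, 5))       # 10 candidate solutions in 5 dimensions
print(generalized_opposite(pop).shape)  # (10, 5)
print(de_rand_1(pop).shape)             # (10, 5)
```

In MFDE-OBL these base operators are extended with inter-task variants (mapping between tasks' subpopulations, borrowing donors from another task); the above shows only the intra-task primitives they build on.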