IBEA-SVM: An Indicator-based Evolutionary Algorithm Based on Pre-selection with Classification Guided by SVM

2019 
Multi-objective optimization has many important applications and remains a challenging issue in applied science. In typical multi-objective optimization algorithms, such as the Indicator-Based Evolutionary Algorithm (IBEA), all parents and offspring must be evaluated in every generation, and the better solutions among them are then selected as candidates for the next generation. This leads to a large amount of computation and slows the convergence rate in IBEA-related applications. Our observation is that the evaluation step of an evolutionary algorithm is in essence a binary classification, and that a meaningful preselection method can accelerate convergence. This paper therefore presents a novel preselection approach to improve the performance of IBEA, in which an SVM (Support Vector Machine) classifier is adopted to separate promising solutions from unpromising ones, and the newly generated solutions are in turn added as training samples to increase the accuracy of the classifier. First, we propose an online and asynchronous training method for the SVM model with an empirical kernel. The initial population is randomly generated according to the population size and used as the initial training set; during training, the SVM classifier is refined to adapt to the samples produced by the evolutionary algorithm. Second, the classifier divides all newly generated solutions into promising and unpromising ones, and only the promising solutions are forwarded for evaluation. In this way, the evaluation time can be greatly reduced and the solution quality clearly improved. Third, the promising and unpromising solutions are labeled as new training samples in the next generation to refine the classifier model. Experiments on a number of benchmark functions validate the proposed approach. The results show that IBEA-SVM can significantly outperform previous works.
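The loop described in the abstract can be illustrated with a minimal sketch. The code below is not the authors' implementation: `evaluate()`, `variation()`, and `environmental_selection()` are hypothetical user-supplied helpers, `evaluate()` is assumed to return a scalar indicator-based fitness (lower is better), and a standard RBF-kernel SVC from scikit-learn stands in for the empirical-kernel SVM described in the paper.

```python
# Sketch of SVM-guided preselection in an IBEA-style loop.
# Assumptions: evaluate(x) -> scalar indicator-based fitness (lower is better);
# variation(pop, rng) -> offspring array; environmental_selection(pop, fit, n)
# -> (pop, fit) truncated to n individuals. All three are hypothetical helpers.

import numpy as np
from sklearn.svm import SVC

def ibea_svm_sketch(evaluate, variation, environmental_selection,
                    pop_size=100, n_var=30, n_gen=200, seed=0):
    rng = np.random.default_rng(seed)

    # 1. Random initial population; its evaluations form the first training set.
    pop = rng.random((pop_size, n_var))
    fitness = np.array([evaluate(x) for x in pop])
    median = np.median(fitness)
    X_train = pop.copy()
    y_train = (fitness <= median).astype(int)   # 1 = promising, 0 = unpromising

    clf = SVC(kernel="rbf")                     # stand-in for the empirical kernel
    clf.fit(X_train, y_train)

    for _ in range(n_gen):
        # 2. Generate offspring and let the classifier preselect them.
        offspring = variation(pop, rng)
        promising_mask = clf.predict(offspring).astype(bool)
        promising = offspring[promising_mask]

        # 3. Only promising offspring are evaluated (the expensive step).
        new_labels = np.zeros(len(offspring), dtype=int)
        if len(promising) > 0:
            off_fitness = np.array([evaluate(x) for x in promising])
            # Relabel evaluated offspring by their true quality.
            new_labels[promising_mask] = (off_fitness <= median).astype(int)
            pop, fitness = environmental_selection(
                np.vstack([pop, promising]),
                np.concatenate([fitness, off_fitness]),
                pop_size)
            median = np.median(fitness)

        # 4. Both classes become new training samples to refine the model.
        X_train = np.vstack([X_train, offspring])
        y_train = np.concatenate([y_train, new_labels])
        clf.fit(X_train, y_train)

    return pop, fitness
```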