A Mahalanobis Distance-Based Fitness Approximation Method for Estimation of Distribution Algorithms in Solving Expensive Optimization Problems

2019 
Fitness approximation methods have been widely employed in evolutionary algorithms to reduce the number of fitness evaluations in solving expensive optimization problems. As a simple and efficient approximation approach, k-nearest neighbors (kNN) estimates the fitness value of an unknown solution by combining the fitness values of its nearest neighbors according to a similarity measure. kNN generally adopts the Euclidean distance as the similarity measure, which may limit its performance because the solution distribution information is underutilized in the approximation process. To address this issue, this study proposes a Mahalanobis distance-based k-nearest neighbors (MkNN) method that improves the approximation accuracy by utilizing the distribution information. Compared to the Euclidean distance-based kNN (EkNN), MkNN adopts the Mahalanobis distance to measure the similarity between solutions, which captures the distribution information of solutions and can thus improve the approximation accuracy. Furthermore, considering that the main idea of estimation of distribution algorithms (EDAs) is also to learn the distribution information of solutions, the proposed MkNN and EkNN are each combined with an EDA, and two new algorithms, named EDA-MkNN and EDA-EkNN respectively, are developed for expensive optimization. The performances of EDA-MkNN and EDA-EkNN were comprehensively tested on a set of 28 benchmark functions and compared with those of a typical EDA. Experimental results demonstrate that MkNN and EkNN can effectively improve the performance of EDA in solving different kinds of expensive optimization problems, and that MkNN can have an edge over EkNN provided that the distribution information is well captured.
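The core of the proposed surrogate is to replace the Euclidean metric in kNN with the Mahalanobis distance d_M(x, y) = sqrt((x - y)^T S^{-1} (x - y)), where S is a covariance matrix describing the solution distribution. The abstract does not spell out how the covariance matrix is obtained or how neighbor fitness values are combined, so the sketch below makes two common assumptions: S is either supplied externally (e.g., by the EDA's probabilistic model) or estimated from the archive of evaluated solutions, and the k neighbors are combined by inverse-distance weighting. The function name mknn_fitness and its parameters are illustrative, not the authors' implementation.

```python
import numpy as np

def mknn_fitness(x, archive_X, archive_f, k=5, cov=None, eps=1e-12):
    """Approximate the fitness of solution x via Mahalanobis-distance kNN.

    archive_X : (n, d) array of previously evaluated solutions
    archive_f : (n,)   array of their true fitness values
    cov       : (d, d) covariance matrix capturing the solution distribution
                (assumption: e.g., the EDA's model); estimated from the
                archive if not provided.
    """
    if cov is None:
        cov = np.cov(archive_X, rowvar=False)
    # Pseudo-inverse guards against a near-singular covariance estimate.
    cov_inv = np.linalg.pinv(cov)

    diff = archive_X - x                                # (n, d) differences
    d2 = np.einsum('ij,jk,ik->i', diff, cov_inv, diff)  # squared Mahalanobis distances
    dist = np.sqrt(np.maximum(d2, 0.0))

    idx = np.argsort(dist)[:k]                          # indices of k nearest neighbors
    w = 1.0 / (dist[idx] + eps)                         # inverse-distance weights (one common choice)
    return float(np.dot(w, archive_f[idx]) / w.sum())
```

Setting cov to the identity matrix recovers the Euclidean EkNN variant, which makes the comparison between EDA-MkNN and EDA-EkNN a direct test of how much the learned distribution information helps the approximation.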