Automated Configuration of Genetic Algorithms by Tuning for Anytime Performance

2022 
Finding the best configuration of algorithms’ hyperparameters for a given optimization problem is an important task in evolutionary computation. In this work, we compare the results of four different hyperparameter optimization (HPO) approaches for a family of genetic algorithms (GAs) on 25 diverse pseudo-Boolean optimization (PBO) problems. More precisely, we compare previously obtained results from a grid search with those obtained from three automated configuration techniques: 1) iterated racing; 2) mixed-integer parallel-efficient global optimization (MIP-EGO); and 3) mixed-integer evolution strategies. Using two different cost metrics, 1) expected running time (ERT) and 2) the area under the empirical cumulative distribution function (ECDF) curve, we find that in several cases the best configurations with respect to ERT are obtained when using the area under the ECDF curve as the cost metric during the configuration process. Our results suggest that even when one is interested in ERT performance, it might be preferable to use anytime performance measures for the configuration task. We also observe that tuning for ERT is much more sensitive to the budget allocated to the target algorithms.
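To make the two cost metrics concrete, the following is a minimal sketch of how ERT and the (normalized) area under the ECDF curve can be computed from run data. The run lengths, budgets, and target-hitting times below are made-up illustrations, not the paper's benchmark data; the function names are our own.

```python
def expected_running_time(evals_used, successes, budget):
    """ERT = total evaluations spent across all runs / number of successes.
    Unsuccessful runs are charged their full evaluation budget."""
    total = sum(e if ok else budget for e, ok in zip(evals_used, successes))
    n_succ = sum(successes)
    return float('inf') if n_succ == 0 else total / n_succ

def ecdf_auc(hitting_times, budgets):
    """Normalized area under the ECDF of (run, target) hitting times,
    evaluated at the given budget checkpoints via the trapezoidal rule.
    `hitting_times` is a flat list; None marks a target never reached."""
    def ecdf(t):
        hit = sum(1 for h in hitting_times if h is not None and h <= t)
        return hit / len(hitting_times)
    area = 0.0
    for lo, hi in zip(budgets, budgets[1:]):
        area += 0.5 * (ecdf(lo) + ecdf(hi)) * (hi - lo)
    return area / (budgets[-1] - budgets[0])  # normalized to [0, 1]

# Toy example: three runs with a budget of 1000 evaluations each,
# two of which reach the target.
print(expected_running_time([120, 450, 1000], [True, True, False],
                            budget=1000))  # (120 + 450 + 1000) / 2 = 785.0

# Four (run, target) pairs, one target never reached.
print(ecdf_auc([120, 450, None, 300], budgets=[0, 250, 500, 1000]))
```

The key contrast: ERT only rewards reaching a fixed target, whereas the ECDF area credits all intermediate progress, which is why it acts as an anytime performance measure.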