Expressivity of Parameterized and Data-driven Representations in Quality Diversity Search
2021
We consider multi-solution optimization and generative models for the
generation of diverse artifacts and the discovery of novel solutions. In cases
where the domain's factors of variation are unknown or too complex to encode
manually, generative models can provide a learned latent space to approximate
these factors. When used as a search space, however, the range and diversity of
possible outputs are limited by the expressivity and generative capabilities of
the learned model. We compare the output diversity of a quality diversity
evolutionary search performed in two different search spaces: 1) a predefined
parameterized space and 2) the latent space of a variational autoencoder model.
We find that the search on an explicit parametric encoding creates more diverse
artifact sets than searching the latent space. A learned model is better at
interpolating between known data points than at extrapolating or expanding
towards unseen examples. We recommend using a generative model's latent space
primarily to measure similarity between artifacts rather than for search and
generation. Whenever a parametric encoding is obtainable, it should be
preferred over a learned representation as it produces a higher diversity of
solutions.
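The comparison above can be illustrated with a minimal MAP-Elites-style quality diversity loop. This is a hypothetical toy sketch, not the paper's experimental setup: the genome here is a direct two-parameter encoding, and the `evaluate`, `mutate`, grid size, and toy objective are all illustrative assumptions. The same loop could instead mutate a latent vector and decode it with a trained VAE, which is the alternative search space the abstract discusses.

```python
import random

# Minimal MAP-Elites sketch over a direct parametric encoding (toy domain).
# All names and settings here are illustrative assumptions, not the paper's.

GRID = 10          # behavior space discretized into GRID x GRID cells
ITERATIONS = 5000

def evaluate(genome):
    """Return (fitness, behavior cell) for a genome in [0, 1]^2 (toy objective)."""
    x, y = genome
    fitness = -(x - 0.5) ** 2 - (y - 0.5) ** 2
    cell = (min(int(x * GRID), GRID - 1),
            min(int(y * GRID), GRID - 1))
    return fitness, cell

def mutate(genome, sigma=0.1):
    """Gaussian perturbation, clipped to the valid parameter range."""
    return tuple(min(1.0, max(0.0, g + random.gauss(0, sigma))) for g in genome)

random.seed(0)
archive = {}  # behavior cell -> (fitness, genome)

# Iterate: pick an elite (or a random genome while the archive is empty),
# mutate it, and keep the child if its cell is empty or it is fitter.
for _ in range(ITERATIONS):
    if archive:
        parent = random.choice(list(archive.values()))[1]
        child = mutate(parent)
    else:
        child = (random.random(), random.random())
    fit, cell = evaluate(child)
    if cell not in archive or fit > archive[cell][0]:
        archive[cell] = (fit, child)

# Coverage (fraction of behavior cells filled) is one diversity measure
# along the lines of the comparison in the abstract.
coverage = len(archive) / GRID ** 2
```

Swapping the direct `(x, y)` genome for a latent vector plus a decoder changes only the encoding, which is exactly the variable the abstract's comparison isolates.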