Content Preserving Text Generation With Attribute Controls

Authors:
Lajanugen Logeswaran University of Michigan
Honglak Lee Google Brain
Samy Bengio Google Brain

Introduction:

In this work, the authors address the problem of modifying textual attributes of sentences. To ensure that the model generates content-compatible sentences, they introduce a reconstruction loss that interpolates between auto-encoding and back-translation loss components.

Abstract:

In this work, we address the problem of modifying textual attributes of sentences. Given an input sentence and a set of attribute labels, we attempt to generate sentences that are compatible with the conditioning information. To ensure that the model generates content-compatible sentences, we introduce a reconstruction loss that interpolates between auto-encoding and back-translation loss components. We propose an adversarial loss to enforce generated samples to be attribute-compatible and realistic. Through quantitative, qualitative, and human evaluations, we demonstrate that our model is capable of generating fluent sentences that better reflect the conditioning information compared to prior methods. We further demonstrate that the model is capable of simultaneously controlling multiple attributes.
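The abstract does not spell out how the interpolation between the auto-encoding and back-translation components is realized. One plausible reading is a convex combination of the two reconstruction signals; the sketch below illustrates that reading only. The function name, the loss-level (rather than latent-level) mixing, and the mixing weight `t` are all assumptions, not the paper's stated method.

```python
def interpolated_reconstruction_loss(ae_loss: float, bt_loss: float, t: float) -> float:
    """Hypothetical sketch: convex combination of the auto-encoding loss
    (reconstructing the input from its own encoding) and the
    back-translation loss (reconstructing it from a generated,
    attribute-altered intermediate sentence).

    t = 0 recovers pure auto-encoding; t = 1 recovers pure
    back-translation. The schedule for t is an assumption here.
    """
    if not 0.0 <= t <= 1.0:
        raise ValueError("mixing weight t must lie in [0, 1]")
    return (1.0 - t) * ae_loss + t * bt_loss
```

In practice, a schedule that moves `t` from 0 toward 1 over training would start from the easier auto-encoding objective and gradually shift weight to back-translation, which is one common motivation for interpolating between such losses.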
