Evaluation of Attention Mechanisms on Text to Speech

2021 
Attention mechanisms have been widely used in sequence-to-sequence tasks. Among these, attention-based neural text-to-speech (TTS) synthesis that exploits the monotonic alignment between text and speech has shown a strong ability to generate natural speech. This paper introduces three attention mechanisms designed to utilize this strict monotonic property and evaluates them on a multi-speaker TTS task.
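
For context, one widely cited formulation of monotonic attention is the expected-alignment recurrence of Raffel et al. (2017). The sketch below is an illustrative NumPy implementation of that recurrence for a single decoder step; it is not drawn from the paper itself (the three mechanisms it evaluates are not specified here), and the function and variable names are assumptions for illustration only.

    import numpy as np

    def sigmoid(x):
        return 1.0 / (1.0 + np.exp(-x))

    def monotonic_attention_step(energies, prev_alignment):
        # One decoder step of soft monotonic attention (Raffel et al., 2017).
        # energies:       (T,) unnormalised scores over encoder positions
        # prev_alignment: (T,) alignment weights from the previous decoder step
        # returns:        (T,) alignment weights for the current decoder step
        p_choose = sigmoid(energies)      # probability of stopping at each position
        alignment = np.zeros(len(energies))
        carry = 0.0                       # probability mass carried forward along the input
        for j in range(len(energies)):
            stay = 1.0 - (p_choose[j - 1] if j > 0 else 0.0)
            carry = stay * carry + prev_alignment[j]
            alignment[j] = p_choose[j] * carry
        return alignment

    # Example: at the first decoder step the alignment is one-hot at position 0,
    # so subsequent steps can only move forward through the input.
    T = 6
    prev = np.eye(T)[0]
    energies = np.random.randn(T)
    print(monotonic_attention_step(energies, prev))

Because the alignment at each step is built only from mass at or after the previously attended positions, the attention cannot jump backwards, which is the monotonic behaviour such TTS models rely on.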