Complex Gated Recurrent Neural Networks

Authors:
Moritz Wolter, University of Bonn
Angela Yao, National University of Singapore

Abstract:

Complex numbers have long been favoured for digital signal processing, yet complex representations rarely appear in deep learning architectures. RNNs, widely used to process time series and sequence information, could greatly benefit from complex representations. We present a novel complex gated recurrent cell, which is a hybrid cell combining complex-valued and norm-preserving state transitions with a gating mechanism. The resulting RNN exhibits excellent stability and convergence properties and performs competitively on the synthetic memory and adding task, as well as on the real-world tasks of human motion prediction.
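To make the architectural idea concrete, the following is a minimal NumPy sketch of one update step of a gated, norm-preserving complex recurrent cell. It is an illustrative assumption rather than the authors' implementation: the random unitary recurrence, the modReLU nonlinearity, and the single real-valued gate are plausible stand-ins for the complex-valued, norm-preserving state transition and gating mechanism described in the abstract, and all variable names are hypothetical.

```python
# Sketch of one step of a complex gated recurrent cell (illustrative, not the paper's code).
import numpy as np

rng = np.random.default_rng(0)

def random_unitary(n, rng):
    """Random unitary matrix via QR decomposition (norm-preserving recurrence)."""
    a = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
    q, r = np.linalg.qr(a)
    d = np.diag(r)
    return q * (d / np.abs(d))  # adjust column phases so Q stays exactly unitary

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def mod_relu(z, b):
    """modReLU: thresholds the modulus while preserving the phase of z."""
    mag = np.abs(z) + 1e-8
    return np.maximum(mag + b, 0.0) * (z / mag)

n_in, n_hid = 4, 8
U  = random_unitary(n_hid, rng)                        # recurrent weights, unitary
V  = 0.1 * rng.standard_normal((n_hid, n_in))          # input projection (real input -> complex hidden)
Wg = 0.1 * rng.standard_normal((n_hid, n_hid + n_in))  # gate weights, acting on the state modulus
bg = np.zeros(n_hid)                                   # gate bias
b  = np.zeros(n_hid)                                   # modReLU bias

def step(h, x):
    """One gated update of the complex hidden state h given real input x."""
    z = U @ h + V @ x                                      # candidate state (complex)
    g = sigmoid(Wg @ np.concatenate([np.abs(h), x]) + bg)  # real-valued gate in [0, 1]
    return g * mod_relu(z, b) + (1.0 - g) * h              # convex mix of candidate and old state

h = np.zeros(n_hid, dtype=np.complex128)
for t in range(5):
    h = step(h, rng.standard_normal(n_in))
print(np.abs(h))
```

In this kind of design, keeping the recurrent matrix unitary means the transition itself neither shrinks nor inflates the hidden state's norm, which is what gives norm-preserving RNNs their stability over long sequences; the gate then decides, per unit, how much of the norm-preserving candidate state to blend into the carried-over state.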
