Multi-Task Learning Based Deep Neural Network for Automatic Modulation Classification

2021 
Automatic modulation classification (AMC) identifies the modulation type of a received signal and plays a vital role in ensuring physical-layer security for Internet of Things (IoT) networks. Inspired by the great success of deep learning in pattern recognition, convolutional neural networks (CNNs) and recurrent neural networks (RNNs) have been introduced into AMC. In general, two data formats are popular in AMC: the in-phase/quadrature (I/Q) representation and the amplitude/phase (A/P) representation. However, most AMC algorithms focus on structural innovations, while the differences and characteristics of I/Q and A/P remain unanalyzed. In this paper, many popular AMC algorithms are reproduced and evaluated on the same dataset, using I/Q and A/P respectively for comparison. The experimental results show that: (i) CNN-RNN-like algorithms using A/P as input are superior to those using I/Q at high signal-to-noise ratio (SNR), while the opposite holds at low SNR; (ii) the features extracted from I/Q and A/P are complementary to each other. Motivated by these findings, a multi-task learning based deep neural network (MLDNN) is proposed, which effectively fuses I/Q and A/P. In addition, the MLDNN has a novel backbone made up of three blocks that extract discriminative features: a CNN block, a bidirectional gated recurrent unit (BiGRU) block, and a step attention fusion network (SAFN) block. Unlike most CNN-RNN-like algorithms, which use only the last step outputs of the RNN, the MLDNN effectively utilizes all step outputs of the BiGRU with the help of the SAFN. Extensive simulations verify that the proposed MLDNN achieves superior performance on the public benchmark.
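Two of the building blocks mentioned in the abstract are easy to illustrate concretely: the standard conversion from I/Q samples to the A/P representation, and the idea of attention-weighting all recurrent step outputs instead of keeping only the last one. The sketch below is a minimal NumPy illustration, not the authors' implementation; the attention query `v`, the function names, and the array shapes are assumptions made for the example.

```python
import numpy as np

def iq_to_ap(iq):
    """Convert I/Q samples to amplitude/phase (A/P) representation.

    iq: array of shape (N, 2) holding in-phase and quadrature components.
    Returns an (N, 2) array of [amplitude, phase] pairs.
    """
    i, q = iq[:, 0], iq[:, 1]
    amplitude = np.sqrt(i**2 + q**2)
    phase = np.arctan2(q, i)
    return np.stack([amplitude, phase], axis=1)

def step_attention_fuse(H, v):
    """Illustrative step-attention fusion over all recurrent step outputs.

    H: (T, D) matrix of step outputs from a recurrent layer (e.g. a BiGRU);
    v: (D,) query vector (learned in a real model; fixed here for illustration).
    Returns a (D,) attention-weighted combination of all T steps, rather than
    just the last step output.
    """
    scores = H @ v                        # (T,) one score per time step
    w = np.exp(scores - scores.max())     # numerically stable softmax
    w /= w.sum()
    return w @ H                          # weighted sum over time steps

# A/P conversion of two unit-amplitude samples:
ap = iq_to_ap(np.array([[1.0, 0.0], [0.0, 1.0]]))
# Fusing two step outputs with a zero query gives uniform weights (the mean):
fused = step_attention_fuse(np.array([[1.0, 0.0], [0.0, 1.0]]), np.zeros(2))
```

With a zero query the softmax weights are uniform, so the fusion reduces to the mean of the step outputs; in a trained model the learned query would emphasize the most discriminative time steps.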