Nonlinearized Relevance Propagation.

2018 
We propose nonlinearized relevance propagation (NRP), an improved method for exploring deep neural networks (DNNs). The method derives from the well-known layer-wise relevance propagation (LRP), which ordinarily employs a linear process to explain a DNN model's outputs. Although nonlinear functions are widely used in most neural network models, to the best of our knowledge they have not been employed in LRP for DNN models. In this paper, we apply NRP to an attentive pooling answer-selection model and compare its performance against sensitivity analysis (SA) and LRP in the linear setting. The results show that exploiting nonlinear functions in LRP helps the identified inputs retain more important information than SA. The contribution of this work is to extend relevance propagation to the understanding of the inner workings of complex DNN models.
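The abstract does not spell out the propagation rule itself, but as a minimal sketch of the idea: standard LRP for a dense layer redistributes the relevance arriving at the layer's output back to its inputs in proportion to each linear contribution a_j * w_jk, and a "nonlinearized" variant could transform those contributions with a nonlinear function before normalizing. The helper names `lrp_linear` and `nrp_linear` and the choice `g = tanh` below are illustrative assumptions, not the paper's definitions:

```python
import numpy as np

def lrp_linear(a, W, R_out, eps=1e-6):
    """Standard linear LRP (epsilon rule) for a dense layer.
    a: input activations, shape (d_in,); W: weights, shape (d_in, d_out);
    R_out: relevance at the layer output, shape (d_out,)."""
    z = a @ W                                      # linear pre-activations
    z = z + eps * np.where(z >= 0, 1.0, -1.0)      # epsilon stabilizer avoids division by zero
    s = R_out / z                                  # relevance per unit of pre-activation
    return a * (W @ s)                             # redistribute relevance to the inputs

def nrp_linear(a, W, R_out, g=np.tanh, eps=1e-6):
    """Hypothetical nonlinearized rule: each contribution a_j * w_jk is
    passed through a nonlinearity g before normalization. g = tanh is an
    illustrative choice, not the paper's definition."""
    Z = g(a[:, None] * W)                          # nonlinear contributions, shape (d_in, d_out)
    denom = Z.sum(axis=0)
    denom = denom + eps * np.where(denom >= 0, 1.0, -1.0)
    return (Z / denom) @ R_out                     # normalized relevance for each input

# Tiny usage example on random data
rng = np.random.default_rng(0)
a = rng.standard_normal(8)          # layer input
W = rng.standard_normal((8, 4))     # layer weights
R = rng.random(4)                   # relevance arriving from the layer above
print(lrp_linear(a, W, R))          # linear redistribution
print(nrp_linear(a, W, R))          # nonlinearized redistribution
```

Both rules approximately conserve total relevance across the layer (up to the epsilon stabilizer), which is the key invariant LRP-style methods maintain when propagating explanations backward through a network.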