Modal Dependency Parsing via Language Model Priming
2022
Modal dependency parsing converts a text into its modal dependency structure, a representation of the factuality of the events the text describes. We design a modal dependency parser based on priming pre-trained language models, and evaluate the parser on two data sets. Compared to baselines, we show an improvement of 2.6 F-score points for English and 4.6 for Chinese. To the best of our knowledge, this is also the first work on Chinese modal dependency parsing.
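As a rough illustration of the priming idea, one common recipe is to mark the target event in the input so the language model conditions on it, then decode the event's modal parent and relation label. The marker token, function names, and the `POS` (full positive factuality) label below are illustrative assumptions, not the paper's exact format:

```python
def prime_input(tokens, event_idx, marker="<EVT>"):
    """Wrap the target event token in marker tags to prime the LM."""
    primed = list(tokens)
    primed[event_idx] = f"{marker} {tokens[event_idx]} {marker}"
    return " ".join(primed)

def parse_output(generated):
    """Split a hypothetical generated 'parent relation' string into an edge."""
    parent, relation = generated.rsplit(" ", 1)
    return {"parent": parent, "relation": relation}

tokens = ["John", "said", "he", "left"]
primed = prime_input(tokens, 3)
# primed == "John said he <EVT> left <EVT>"
edge = parse_output("said POS")
# edge == {"parent": "said", "relation": "POS"}
```

In a full system, the primed string would be fed to a pre-trained encoder-decoder, and the decoded edges assembled into a modal dependency tree over all events in the document.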