Improving AMR parsing by exploiting the dependency parsing as an auxiliary task
Taizhong Wu1 · Junsheng Zhou1 · Weiguang Qu1 · Yanhui Gu1 · Bin Li1 · Huilin Zhong1 · Yunfei Long2

Received: 26 March 2020 / Revised: 30 June 2020 / Accepted: 24 September 2020
© Springer Science+Business Media, LLC, part of Springer Nature 2020

Junsheng Zhou: [email protected]
Weiguang Qu: [email protected]
Yunfei Long: [email protected]

1 School of Computer Science and Technology, Nanjing Normal University, Nanjing, China
2 School of Computer Science and Electronic Engineering, University of Essex, Colchester, UK
Abstract

Abstract meaning representations (AMRs) represent sentence semantics as rooted, labeled, directed acyclic graphs. Although there is a strong correlation between the AMR graph of a sentence and its corresponding dependency tree, recent neural network AMR parsers neglect to exploit this dependency structure information. In this paper, we explore a novel approach to exploiting dependency structures for AMR parsing. Unlike traditional pipeline models, we treat dependency parsing as an auxiliary task for AMR parsing under the multi-task learning framework, sharing neural network parameters and selectively extracting syntactic representations with an attention mechanism. In particular, to balance the gradients and keep the focus on the AMR parsing task, we present a new dynamic weighting scheme for the loss function. Experimental results on the LDC2015E86 and LDC2017T10 datasets show that our dependency-auxiliary AMR parser significantly outperforms both the baseline and its pipeline counterpart, and demonstrate that neural AMR parsers can be greatly boosted by effective methods of integrating syntax.

Keywords Abstract meaning representations · Multi-task learning · Dependency-auxiliary AMR parser · Neural network
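To make the summarized setup concrete, the following is a minimal PyTorch sketch (not the authors' code) of multi-task learning with a shared encoder, two task-specific heads, and a dynamically weighted joint loss. The module sizes, the per-token classification heads, and the linear decay of the auxiliary weight are all illustrative assumptions; the paper's actual architecture and weighting scheme are specified in later sections.

```python
import torch
import torch.nn as nn

class SharedEncoderMTL(nn.Module):
    """Shared BiLSTM encoder with AMR and dependency heads (illustrative)."""
    def __init__(self, vocab_size, d_model=256, amr_labels=120, dep_labels=45):
        super().__init__()
        # Parameters below are shared between the two tasks.
        self.embed = nn.Embedding(vocab_size, d_model)
        self.encoder = nn.LSTM(d_model, d_model // 2,
                               batch_first=True, bidirectional=True)
        # Task-specific heads, simplified here to per-token classifiers.
        self.amr_head = nn.Linear(d_model, amr_labels)
        self.dep_head = nn.Linear(d_model, dep_labels)

    def forward(self, tokens):
        h, _ = self.encoder(self.embed(tokens))
        return self.amr_head(h), self.dep_head(h)

def joint_loss(amr_logits, dep_logits, amr_gold, dep_gold, step, total_steps):
    """Joint objective with a dynamically weighted auxiliary term."""
    ce = nn.CrossEntropyLoss()
    amr_loss = ce(amr_logits.flatten(0, 1), amr_gold.flatten())
    dep_loss = ce(dep_logits.flatten(0, 1), dep_gold.flatten())
    # Hypothetical schedule: the auxiliary weight decays linearly, so the
    # gradients increasingly favor the main AMR objective as training runs.
    aux_weight = max(0.0, 1.0 - step / total_steps)
    return amr_loss + aux_weight * dep_loss
```

Decaying the auxiliary weight is one common way to let the dependency signal shape the shared encoder early in training while the AMR objective dominates later; the paper defines its own dynamic scheme.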
1 Introduction

Abstract meaning representation (AMR) [2] is a broad-coverage, sentence-level semantic representation. Recently, AMR parsing has attracted much attention.
In particular, neural AMR parsers have achieved state-of-the-art performance. However, compared with mature dependency parsers, AMR parsing is still at a nascent stage. At present, the best dependency parser attains an LAS F1 score of about 95% on the public English PTB dataset [12], while the state-of-the-art AMR parser reaches a Smatch F1 score of only about 74% on the most recent LDC2017T10 dataset [13, 26]. There is no doubt that a huge performance gap currently exists between the two types of parsers.

After analyzing a large number of sentences annotated with both AMRs and dependency trees, we found that the semantic relations in AMRs correlate strongly with the syntactic relations in dependency trees. For example, as shown in Figure 1, several semantic relations in the AMR graph correspond directly to syntactic relations in the dependency tree. However, some recent neural network AMR parsers achieve state-of-the-art performance without exploiting such syntactic information.
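As background on the metric mentioned above: Smatch scores a predicted AMR against the gold AMR as the F1 over matching relation triples under the best one-to-one mapping between their variables. The sketch below is a simplified illustration, not the official tool; it assumes the variable mapping is already given, whereas real Smatch searches for the best mapping (typically by hill climbing).

```python
def triple_f1(gold_triples, pred_triples, var_map):
    """F1 over AMR relation triples, given a fixed variable mapping.

    gold_triples / pred_triples: sets of (source, relation, target);
    var_map: dict mapping predicted variables to gold variables.
    """
    # Rename predicted variables into the gold graph's namespace;
    # non-variable targets (e.g. concept names) pass through unchanged.
    remapped = {(var_map.get(s, s), r, var_map.get(t, t))
                for (s, r, t) in pred_triples}
    matched = len(remapped & set(gold_triples))
    precision = matched / len(pred_triples) if pred_triples else 0.0
    recall = matched / len(gold_triples) if gold_triples else 0.0
    if precision + recall == 0.0:
        return 0.0
    return 2 * precision * recall / (precision + recall)

# Tiny usage example ("the boy wants ..."), with instance and role triples:
gold = {("b", "instance", "boy"), ("w", "instance", "want-01"),
        ("w", "ARG0", "b")}
pred = {("x", "instance", "boy"), ("y", "instance", "want-01"),
        ("y", "ARG0", "x")}
print(triple_f1(gold, pred, {"x": "b", "y": "w"}))  # -> 1.0
```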