Table 5

Span SRL results without pre-identified predicates on the CoNLL-2005 and CoNLL-2012 data sets. The “PLM” column indicates whether, and which, pre-trained language model is used; the “SYN” column indicates whether syntax information is employed; “+E” in the “PLM” column indicates that the model leverages ELMo for pre-trained language model features. [Ens.] marks an ensemble system, and [Joint] marks joint learning with other tasks.

| System | PLM | SYN | CoNLL05 WSJ P / R / F1 | CoNLL05 Brown P / R / F1 | CoNLL12 P / R / F1 |
|---|---|---|---|---|---|
| (He et al. 2017) | | | 80.2 / 82.3 / 81.2 | 67.6 / 69.6 / 68.5 | 78.6 / 75.1 / 76.8 |
| [Ens.] (He et al. 2017) | | | 82.0 / 83.4 / 82.7 | 69.7 / 70.5 / 70.1 | 80.2 / 76.6 / 78.4 |
| (He et al. 2018a) | +E | | 84.8 / 87.2 / 86.0 | 73.9 / 78.4 / 76.1 | 81.9 / 84.0 / 82.9 |
| (Strubell et al. 2018) | | | 84.0 / 83.2 / 83.6 | 73.3 / 70.6 / 71.9 | 81.9 / 79.6 / 80.7 |
| (Strubell et al. 2018) | +E | | 86.7 / 86.4 / 86.6 | 79.0 / 77.2 / 78.1 | 84.0 / 82.3 / 83.1 |
| [Joint] (Zhou, Li, and Zhao 2020) | | | 83.7 / 85.5 / 84.6 | 72.0 / 73.1 / 72.6 | − / − / − |
| [Joint] (Zhou, Li, and Zhao 2020) | +E | | 85.3 / 87.7 / 86.5 | 76.1 / 78.3 / 77.2 | − / − / − |
| Sequence-based | +E | | 84.4 / 83.6 / 84.0 | 76.5 / 73.9 / 75.2 | 81.7 / 82.9 / 82.3 |
| +GCN Syntax Encoder | +E | | 85.5 / 84.3 / 84.9 | 78.8 / 73.4 / 76.0 | 83.1 / 82.5 / 82.8 |
| +SA-LSTM Syntax Encoder | +E | | 85.0 / 84.2 / 84.6 | 74.9 / 76.7 / 75.8 | 83.1 / 81.9 / 82.5 |
| +Tree-LSTM Syntax Encoder | +E | | 84.7 / 84.1 / 84.4 | 76.2 / 75.2 / 75.7 | 82.7 / 81.9 / 82.3 |
| Tree-based | +E | | 85.4 / 83.6 / 84.5 | 76.1 / 75.1 / 75.6 | 83.3 / 81.9 / 82.6 |
| +GCN Syntax Encoder | +E | | 84.5 / 85.9 / 85.2 | 76.7 / 75.9 / 76.3 | 82.9 / 83.3 / 83.1 |
| +SA-LSTM Syntax Encoder | +E | | 85.0 / 85.0 / 85.0 | 77.2 / 75.0 / 76.1 | 83.5 / 82.7 / 83.1 |
| +Tree-LSTM Syntax Encoder | +E | | 85.7 / 84.1 / 84.9 | 75.5 / 75.9 / 75.7 | 83.0 / 82.4 / 82.7 |
| Graph-based (2019a) | +E | | 85.2 / 87.5 / 86.3 | 74.7 / 78.1 / 76.4 | 84.9 / 81.4 / 83.1 |
| +Constituent Soft Pruning | +E | | 87.1 / 85.7 / 86.4 | 77.0 / 76.2 / 76.6 | 83.4 / 83.2 / 83.3 |
| +GCN Syntax Encoder | +E | | 86.9 / 86.5 / 86.7 | 77.5 / 76.3 / 76.9 | 84.4 / 83.0 / 83.7 |
| +SA-LSTM Syntax Encoder | +E | | 87.3 / 85.7 / 86.5 | 76.0 / 77.2 / 76.6 | 83.8 / 83.2 / 83.5 |
| +Tree-LSTM Syntax Encoder | +E | | 85.8 / 86.6 / 86.2 | 76.9 / 76.1 / 76.5 | 83.6 / 82.8 / 83.2 |
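
The P, R, and F1 columns follow the standard span SRL evaluation: a predicted argument counts as correct only if its predicate position, span boundaries, and role label all match a gold argument exactly. The sketch below is a minimal illustration of that computation; the tuple layout and function name are illustrative assumptions rather than the authors' code, and the official CoNLL-2005 srl-eval.pl scorer handles additional bookkeeping beyond this.

```python
from typing import Set, Tuple

# A labeled argument, identified by its predicate position, span boundaries, and role
# (layout assumed here for illustration): (predicate_index, span_start, span_end, role).
Span = Tuple[int, int, int, str]

def span_srl_prf(gold: Set[Span], predicted: Set[Span]) -> Tuple[float, float, float]:
    """Exact-match precision, recall, and F1 over labeled argument spans."""
    correct = len(gold & predicted)
    precision = correct / len(predicted) if predicted else 0.0
    recall = correct / len(gold) if gold else 0.0
    f1 = 2 * precision * recall / (precision + recall) if precision + recall else 0.0
    return precision, recall, f1

# Hypothetical example: one predicate with two gold arguments; the model recovers one
# exactly and mislocates the other, so P = R = F1 = 0.5.
gold = {(2, 0, 1, "ARG0"), (2, 3, 5, "ARG1")}
pred = {(2, 0, 1, "ARG0"), (2, 4, 5, "ARG1")}
print(span_srl_prf(gold, pred))  # (0.5, 0.5, 0.5)
```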