Table 2. Results of all models on ENER and CoNLL-2003.

| Model | ENER P (%) | ENER R (%) | ENER F1 (%) | CoNLL-2003 P (%) | CoNLL-2003 R (%) | CoNLL-2003 F1 (%) |
|---|---|---|---|---|---|---|
| BiGRU | 62.15 | 68.65 | 65.28 | 89.71 | 90.97 | 90.24 |
| BiLSTM-CNN | 62.78 | 69.99 | 66.45 | 91.89 | 91.48 | 91.62 |
| BiLSTM-CRF | 63.92 | 70.13 | 66.94 | 92.05 | 91.62 | 91.78 |
| SL-BERT | 66.37 | 72.67 | 69.01 | 91.93 | 91.54 | 91.73 |
| SL-BART | 65.56 | 73.20 | 68.98 | 89.60 | 91.63 | 90.60 |
| Template-BART | 65.15 | 74.28 | 70.03 | 90.51 | 93.34 | 91.90 |
| FewNER | 64.82 | 73.78 | 69.54 | 90.94 | 92.59 | 91.32 |
| Model-Fusion | 64.18 | 73.92 | 69.37 | 89.91 | 91.89 | 90.47 |
| MCP-NER | 68.03 | 75.34 | 72.45 | 92.48 | 93.08 | 92.72 |
| MCP-NER w/o prompt | 67.41 | 74.11 | 71.28 | 91.87 | 92.42 | 92.05 |
| MCP-NER, input prefix prompt | 67.32 | 74.47 | 71.39 | 92.03 | 92.79 | 92.49 |
| MCP-NER w/o ML | 65.69 | 72.26 | 69.59 | 91.32 | 91.97 | 91.66 |

Note: Here P, R, and F1 denote Precision, Recall, and F1 score, respectively.
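For reference, these metrics follow the standard definitions (a recap of common usage, not a detail stated in the table itself), where TP, FP, and FN denote true positives, false positives, and false negatives over predicted entity spans:

$$P = \frac{TP}{TP + FP}, \qquad R = \frac{TP}{TP + FN}, \qquad F_1 = \frac{2 \, P \, R}{P + R}.$$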
