Table 2.
NER model results (strict).
| Model | Disease & diagnosis | Imaging examination | Laboratory examination | Surgery | Drug | Anatomic part | Total |
| --- | --- | --- | --- | --- | --- | --- | --- |
| BiLSTM-CRF | 0.660 | 0.787 | 0.711 | 0.709 | 0.862 | 0.807 | 0.767 |
| BERT_BC | 0.838 | 0.840 | 0.745 | 0.856 | 0.873 | 0.869 | 0.836 |
| W_BC | 0.846 | 0.844 | 0.749 | 0.865 | 0.870 | 0.872 | 0.858 |
| W_FPT_BC | 0.849 | 0.843 | 0.738 | 0.892 | 0.899 | 0.878 | 0.866 |
| W_FPT_BC + pseudo labelling | 0.860 | 0.843 | 0.758 | 0.910 | 0.895 | 0.881 | 0.871 |
| W_FPT_BC + domain dictionary + pseudo labelling | 0.850 | 0.844 | 0.762 | 0.929 | 0.913 | 0.881 | 0.873 |
| RWL_BC + domain dictionary + pseudo labelling | 0.847 | 0.849 | 0.758 | 0.918 | 0.914 | 0.884 | 0.873 |
| RWL + domain dictionary + pseudo labelling | 0.854 | 0.855 | 0.765 | 0.931 | 0.922 | 0.893 | 0.882 |
| Fusion model | 0.872 | 0.859 | 0.792 | 0.940 | 0.923 | 0.891 | 0.887 |

Note: BERT_BC, W_BC, W_FPT_BC, RWL, and RWL_BC refer to the BERT_BiLSTM-CRF, WoBERT_BiLSTM-CRF, WoBERT_FurtherPretraining_BiLSTM-CRF, RoBERTa-wwm-large, and RoBERTa-wwm-large-BiLSTM-CRF models, respectively.
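
The table reports results under strict evaluation, meaning a predicted entity is counted as correct only when both its boundaries and its type exactly match a gold annotation. The sketch below illustrates how strict entity-level precision, recall, and micro-averaged F1 can be computed; the function name, entity representation, and example data are illustrative assumptions, not the authors' evaluation code.

```python
from typing import List, Set, Tuple

# An entity is (start_offset, end_offset, type); strict matching requires
# an exact match of all three fields between gold and predicted entities.
Entity = Tuple[int, int, str]

def strict_scores(gold: List[Set[Entity]], pred: List[Set[Entity]]):
    """Entity-level precision/recall/F1 under strict matching,
    micro-averaged over a corpus of sentences."""
    tp = fp = fn = 0
    for g, p in zip(gold, pred):
        tp += len(g & p)   # exact boundary + type matches
        fp += len(p - g)   # predicted entities with no exact gold match
        fn += len(g - p)   # gold entities missed or mismatched
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    f1 = 2 * precision * recall / (precision + recall) if precision + recall else 0.0
    return precision, recall, f1

# Illustrative example: one sentence with two gold entities; the second
# prediction has a boundary error, so it does not count under strict matching.
gold = [{(0, 4, "Drug"), (10, 18, "Surgery")}]
pred = [{(0, 4, "Drug"), (10, 17, "Surgery")}]
print(strict_scores(gold, pred))  # (0.5, 0.5, 0.5)
```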
