Results of the supervised approach to readability in terms of accuracy, weighted precision, weighted recall, weighted F1-score, and quadratic weighted kappa (QWK) for the three neural network classifiers (BERT, HAN, BiLSTM), alongside methods from the literature.
| Measure/Data set | WeeBit | OneStopEnglish | Newsela | Slovenian SB |
|---|---|---|---|---|
| Filighera et al. (2019) accuracy | 0.8130 | – | – | – |
| Xia et al. (2016) accuracy | 0.8030 | – | – | – |
| SVM-BF (Deutsch et al., 2020) F1 | 0.8381 | – | 0.7627 | – |
| SVM-HF (Deutsch et al., 2020) F1 | – | – | 0.8014 | – |
| Vajjala et al. (2018) accuracy | – | 0.7813 | – | – |
| BERT accuracy | 0.8573 | 0.6738 | 0.7573 | 0.4545 |
| BERT precision | 0.8658 | 0.7395 | 0.7510 | 0.4736 |
| BERT recall | 0.8573 | 0.6738 | 0.7573 | 0.4545 |
| BERT F1 | 0.8581 | 0.6772 | 0.7514 | 0.4157 |
| BERT QWK | 0.9527 | 0.7077 | 0.9789 | 0.8855 |
| HAN accuracy | 0.7520 | 0.7872 | 0.8138 | 0.4887 |
| HAN precision | 0.7534 | 0.7977 | 0.8147 | 0.4866 |
| HAN recall | 0.7520 | 0.7872 | 0.8138 | 0.4887 |
| HAN F1 | 0.7520 | 0.7888 | 0.8101 | 0.4847 |
| HAN QWK | 0.8860 | 0.8245 | 0.9835 | 0.8070 |
| BiLSTM accuracy | 0.7743 | 0.6875 | 0.7111 | 0.5277 |
| BiLSTM precision | 0.7802 | 0.7177 | 0.6910 | 0.5239 |
| BiLSTM recall | 0.7743 | 0.6875 | 0.7111 | 0.5277 |
| BiLSTM F1 | 0.7750 | 0.6920 | 0.6985 | 0.5219 |
| BiLSTM QWK | 0.9060 | 0.7230 | 0.9628 | 0.7980 |
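For clarity on how the reported measures are obtained, the sketch below computes weighted precision/recall/F1 and quadratic weighted kappa with scikit-learn. The label arrays are purely illustrative placeholders, not data from any of the evaluated corpora.

```python
from sklearn.metrics import (accuracy_score, cohen_kappa_score,
                             precision_recall_fscore_support)

# Illustrative gold and predicted readability levels (0 = easiest).
y_true = [0, 0, 1, 1, 2, 2, 3, 3]
y_pred = [0, 1, 1, 1, 2, 3, 3, 3]

accuracy = accuracy_score(y_true, y_pred)

# "Weighted" averaging weights each class's score by its support,
# which is how the weighted precision/recall/F1 rows are defined.
precision, recall, f1, _ = precision_recall_fscore_support(
    y_true, y_pred, average="weighted", zero_division=0)

# QWK penalizes errors by the squared distance between predicted and
# gold levels, so a one-level miss costs far less than a distant one.
qwk = cohen_kappa_score(y_true, y_pred, weights="quadratic")

print(accuracy, precision, recall, f1, qwk)
```

Because QWK rewards near-misses, models can score much higher on QWK than on accuracy (compare, e.g., BERT on Slovenian SB: 0.4545 accuracy vs. 0.8855 QWK), which is why both are reported.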