Models          Similarity (ρ)   Directionality (Acc)   Classification (Acc)
                Hyperlex         wbless    bibless      bless    leds     eval     weeds
Vanilla         0.1352           0.5101    0.4894       0.1115   0.7164   0.2404   0.5335
Retrofitting    0.1718           0.5603    0.5469       0.1440   0.7337   0.2648   0.5846
Counterfitting  0.3440           0.6196    0.6071       0.1851   0.7344   0.3296   0.6342
LEAR            0.4346           0.6779    0.6683       0.2815   0.7413   0.3623   0.6926
LexSub          0.5327           0.8228    0.7252       0.5884   0.9290   0.4359   0.9101

(b) Hypernymy evaluation results for baselines and LexSub trained with the lexical resource from LEAR.