Leaderboard

| Rank | Model | Fine-tuning | Average | NKJP-NER | CDSC-E | CDSC-R | CBD | PolEmo2.0-IN | PolEmo2.0-OUT | DYK | PSC | AR |
|---:|---|---|---:|---:|---:|---:|---:|---:|---:|---:|---:|---:|
| 1 | Polish RoBERTa-v2 (large) | Polish RoBERTa scripts | 88.9 | 95.8 | 94.3 | 95.1 | 74.3 | 93.1 | 84.0 | 75.4 | 98.8 | 89.2 |
| 2 | HerBERT (large) | None | 88.4 | 96.4 | 94.1 | 94.9 | 72.0 | 92.2 | 81.8 | 75.8 | 98.9 | 89.1 |
| 3 | XLM-RoBERTa (large) + NKJP | Polish RoBERTa scripts | 87.8 | 94.2 | 94.2 | 94.5 | 72.4 | 93.1 | 77.9 | 77.5 | 98.6 | 88.2 |
| 4 | Polish RoBERTa (large) | Polish RoBERTa scripts | 87.8 | 94.5 | 93.3 | 94.9 | 71.1 | 92.8 | 82.4 | 73.4 | 98.8 | 88.8 |
| 5 | XLM-RoBERTa (large) | Polish RoBERTa scripts | 87.5 | 94.1 | 94.4 | 94.7 | 70.6 | 92.4 | 81.0 | 72.8 | 98.9 | 88.4 |
| 6 | Polish RoBERTa-v2 (base) | Polish RoBERTa scripts | 86.8 | 94.3 | 94.2 | 94.7 | 70.2 | 90.9 | 78.5 | 71.5 | 98.8 | 87.7 |
| 7 | XLM-RoBERTa (large) | FT2 | 86.6 | 94.6 | 94.4 | 94.7 | 67.9 | 90.4 | 79.8 | 71.6 | 98.2 | 87.5 |
| 8 | HerBERT (base) | None | 86.3 | 94.5 | 94.5 | 94.0 | 67.4 | 90.9 | 80.4 | 68.1 | 98.9 | 87.7 |
| 9 | TrelBERT | Polish RoBERTa scripts (adapted to the transformers library) | 86.1 | 94.4 | 93.9 | 93.6 | 76.1 | 89.3 | 78.1 | 67.4 | 95.7 | 86.1 |
| 10 | Polish RoBERTa (base) | Polish RoBERTa scripts | 85.3 | 93.9 | 94.2 | 94.0 | 66.7 | 90.6 | 76.3 | 65.9 | 98.8 | 87.8 |
| 11 | XLM-RoBERTa (large) | v0.1.0 | 84.7 | 94.6 | 94.4 | 94.7 | 50.7 | 90.4 | 79.8 | 71.6 | 98.2 | 87.5 |
| 12 | HerBERT (base) | SimCSE SNLI + MLM | 84.5 | 94.6 | 94.8 | 94.2 | 58.6 | 88.8 | 74.9 | 69.8 | 98.5 | 86.8 |
| 13 | PoLitBert_v32k_linear_50k | Polish RoBERTa scripts | 83.5 | 92.3 | 91.5 | 92.2 | 64.0 | 89.8 | 76.1 | 60.2 | 97.9 | 87.6 |
| 14 | PoLitBert_v32k_tri_125k | Polish RoBERTa scripts | 83.2 | 93.6 | 91.7 | 91.8 | 62.4 | 90.3 | 75.7 | 59.0 | 97.4 | 87.2 |
| 15 | PoLitBert_v32k_tri_50k | Polish RoBERTa scripts | 82.5 | 93.9 | 91.7 | 92.1 | 57.6 | 88.8 | 77.9 | 56.6 | 96.5 | 87.7 |
| 16 | PoLitBert_v32k_linear_125k | Polish RoBERTa scripts | 82.4 | 94.0 | 91.3 | 91.8 | 61.1 | 90.4 | 78.1 | 50.8 | 95.8 | 88.2 |
| 17 | Sup-SimCSE SNLI XLM-R (base) | SimCSE SNLI | 81.7 | 92.1 | 93.6 | 93.1 | 53.3 | 88.2 | 78.5 | 54.4 | 97.0 | 85.4 |
| 18 | Polbert (cased) | v0.1.0 | 81.7 | 93.6 | 93.4 | 93.8 | 52.7 | 87.4 | 71.1 | 59.1 | 98.6 | 85.2 |
| 19 | XLM-RoBERTa (base) | v0.1.0 | 81.5 | 92.1 | 94.1 | 93.3 | 51.0 | 89.5 | 74.7 | 55.8 | 98.2 | 85.2 |
| 20 | Polbert (uncased) | v0.1.0 | 81.4 | 90.1 | 93.9 | 93.5 | 55.0 | 88.1 | 68.8 | 59.4 | 98.8 | 85.4 |
| 21 | ptaszor | | 81.4 | 93.2 | 92.8 | 91.6 | 51.9 | 87.3 | 74.7 | 61.1 | 93.7 | 85.9 |
| 22 | HerBERT - KLEJ | v0.1.0 | 80.5 | 92.7 | 92.5 | 91.9 | 50.3 | 89.2 | 76.3 | 52.1 | 95.3 | 84.5 |
| 23 | XLM-17 | v0.1.0 | 80.2 | 91.9 | 93.7 | 92.0 | 44.8 | 86.3 | 70.6 | 61.8 | 96.3 | 84.5 |
| 24 | XLM-100 | v0.1.0 | 79.9 | 91.6 | 93.7 | 91.8 | 42.5 | 85.6 | 69.8 | 63.0 | 96.8 | 84.2 |
| 25 | Slavic BERT | v0.1.0 | 79.8 | 93.3 | 93.7 | 93.3 | 43.1 | 87.1 | 67.6 | 57.4 | 98.3 | 84.3 |
| 26 | Multilingual BERT (cased) | v0.1.0 | 79.5 | 91.4 | 93.8 | 92.9 | 40.0 | 85.0 | 66.6 | 64.2 | 97.9 | 83.3 |
| 27 | Multilingual BERT + Polish SQuAD1.1 | v0.1.0 | 79.4 | 91.4 | 93.5 | 92.8 | 46.1 | 83.1 | 64.0 | 62.3 | 97.9 | 83.2 |
| 28 | Multilingual BERT + Polish SQuAD2.0 | v0.1.0 | 77.0 | 91.1 | 92.7 | 91.9 | 36.9 | 83.5 | 56.1 | 60.1 | 97.6 | 83.0 |
| 29 | ELMo | jiant | 76.7 | 93.4 | 89.3 | 91.1 | 47.9 | 87.4 | 70.6 | 30.9 | 93.7 | 86.2 |
| 30 | PolBERTa | v0.1.0 | 73.0 | 87.8 | 91.2 | 87.9 | 41.9 | 82.5 | 65.0 | 34.3 | 84.5 | 82.0 |
| 31 | Sample Submission | None | 28.5 | 20.3 | 59.2 | 1.4 | 11.2 | 27.4 | 30.2 | 18.9 | 30.4 | 57.2 |
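The Average column appears to be the unweighted arithmetic mean of the nine task scores, rounded to one decimal place. A quick sanity check in Python using the top row's per-task scores from the table above:

```python
# Per-task scores for rank 1 (Polish RoBERTa-v2 large), in table order:
# NKJP-NER, CDSC-E, CDSC-R, CBD, PolEmo2.0-IN, PolEmo2.0-OUT, DYK, PSC, AR
scores = [95.8, 94.3, 95.1, 74.3, 93.1, 84.0, 75.4, 98.8, 89.2]

# Unweighted mean, rounded to one decimal place
average = round(sum(scores) / len(scores), 1)
print(average)  # 88.9, matching the Average column
```

The same calculation reproduces the Average for the other rows as well (e.g. HerBERT large: mean of its nine scores is 88.4).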