Rank | Team | Model | Avg | LID | POS | NER | SA | MT
---|---|---|---|---|---|---|---|---
- | LinCE Organizers | mBERT | - | 95.12 | 91.68 | 67.33 | 56.43 | -
- | LinCE Organizers | BERT base, cased | - | 94.89 | 91.97 | 65.02 | 58.40 | -
- | LinCE Organizers | ELMo small | - | 93.95 | 91.53 | 59.35 | 52.88 | -
- | GeoReactor | Spanish-based BERT | - | - | - | - | 56.47 | -
- | Remy | Fine-tuned M-BERT | - | - | - | - | 57.54 | -
- | UH & Salesforce Research | Char2subword mBERT | - | 95.48 | 92.55 | 68.05 | 59.07 | -
- | HKUST | HME-Ensemble | - | - | 93.04 | 69.93 | - | -
- | HKUST | HME | - | - | 92.60 | 67.66 | - | -
- | HKUST | XLM-R Base | - | - | 93.98 | 68.60 | - | -
- | HKUST | XLM-R Large | - | - | 94.38 | 72.01 | - | -
- | HKUST | XLM-MLM-100 | - | - | 93.07 | 68.62 | - | -
- | IBM | mBERT | - | - | - | - | 57.27 | -
- | UBA & UNC | RoBERTuito | - | - | - | - | 60.57 | -
- | anonymous | XLMR_multi-labels | - | 96.49 | 94.48 | 72.72 | 62.21 | -
Rank | Team | Model | Avg | SPA-ENG | HIN-ENG | NEP-ENG | MSA-EA
---|---|---|---|---|---|---|---
1 | anonymous | XLMR_multi-labels | 96.49 | 98.64 | 97.07 | 96.74 | 93.51
2 | UH & Salesforce Research | Char2subword mBERT | 95.48 | 98.33 | 96.23 | 96.19 | 91.19
3 | LinCE Organizers | mBERT | 95.12 | 98.36 | 94.24 | 96.32 | 91.55
4 | LinCE Organizers | BERT base, cased | 94.89 | 98.35 | 96.40 | 96.46 | 88.36
5 | LinCE Organizers | ELMo small | 93.95 | 97.93 | 95.43 | 95.90 | 86.53
- | Remy | Fine-tuned M-BERT | - | 97.27 | - | - | -
- | Dana, Sara & Rasmus | Viterbi Decoding | - | 92.23 | - | - | -
- | IRLab@IITBHU | BERT | - | - | 96.14 | - | -
- | IRLab@IITBHU | BERT | - | 87.34 | - | - | -
- | IRLab@IITBHU | GloVe | - | - | 89.14 | - | -
- | LLAMA | XLM Baseline | - | - | 95.66 | - | -
- | Igor Sterner | Any-English (AnE) | - | 98.44 | 96.86 | 96.66 | -
Rank | Team | Model | Avg | SPA-ENG | HIN-ENG
---|---|---|---|---|---
1 | anonymous | XLMR_multi-labels | 94.48 | 97.22 | 91.73
2 | HKUST | XLM-R Large | 94.38 | 97.18 | 91.59
3 | HKUST | XLM-R Base | 93.98 | 96.96 | 91.00
4 | HKUST | XLM-MLM-100 | 93.07 | 97.04 | 89.10
5 | HKUST | HME-Ensemble | 93.04 | 96.78 | 89.30
6 | HKUST | HME | 92.60 | 96.66 | 88.55
7 | UH & Salesforce Research | Char2subword mBERT | 92.55 | 96.88 | 88.23
8 | LinCE Organizers | BERT base, cased | 91.97 | 96.92 | 87.02
9 | LinCE Organizers | mBERT | 91.68 | 97.07 | 86.30
10 | LinCE Organizers | ELMo small | 91.53 | 96.34 | 86.71
- | Remy | Fine-tuned M-BERT | - | 97.13 | -
- | UBA & UNC | RoBERTuito | - | 97.17 | -
Rank | Team | Model | Avg | SPA-ENG | HIN-ENG | MSA-EA
---|---|---|---|---|---|---
1 | anonymous | XLMR_multi-labels | 72.72 | 68.84 | 81.16 | 68.17
2 | HKUST | XLM-R Large | 72.01 | 69.55 | 80.70 | 65.78
3 | HKUST | HME-Ensemble | 69.93 | 65.11 | 75.97 | 68.71
4 | HKUST | XLM-MLM-100 | 68.62 | 64.16 | 74.49 | 67.22
5 | HKUST | XLM-R Base | 68.60 | 64.95 | 75.72 | 65.13
6 | UH & Salesforce Research | Char2subword mBERT | 68.05 | 64.65 | 73.38 | 66.13
7 | HKUST | HME | 67.66 | 63.06 | 73.78 | 66.14
8 | LinCE Organizers | mBERT | 67.33 | 64.05 | 72.57 | 65.39
9 | LinCE Organizers | BERT base, cased | 65.02 | 61.15 | 74.46 | 59.44
10 | LinCE Organizers | ELMo small | 59.35 | 52.58 | 68.79 | 56.68
- | Remy | Fine-tuned M-BERT | - | 63.85 | - | -
- | IBM | indicBERT | - | - | 80.09 | -
- | UBA & UNC | RoBERTuito | - | 68.50 | - | -
- | Anonymous | Roberta-large | - | - | 80.47 | -
- | Amazon IML | IndicBERT | - | - | 80.65 | -
- | Amazon IML | Comix (Ph) | - | - | 82.16 | -
- | Amazon IML | Comix (WSG) | - | - | 81.78 | -
- | Amazon IML | Comix (DKGA) | - | - | 81.42 | -
- | Amazon IML | Comix DKGA+WSG Ensemble | - | - | 82.40 | -
- | Amazon IML | Comix (DKGA+WSG) | - | - | 81.62 | -
- | Amazon IML | Comix DKGA+WSG+Phonetics Ensemble | - | - | 83.07 | -
- | a | bert | - | - | 69.97 | -
Rank | Team | Model | Avg | SPA-ENG
---|---|---|---|---
1 | anonymous | XLMR_multi-labels | 62.21 | 62.21
2 | UBA & UNC | RoBERTuito | 60.57 | 60.57
3 | UH & Salesforce Research | Char2subword mBERT | 59.07 | 59.07
4 | LinCE Organizers | BERT base, cased | 58.40 | 58.40
5 | Remy | Fine-tuned M-BERT | 57.54 | 57.54
6 | IBM | mBERT | 57.27 | 57.27
7 | GeoReactor | Spanish-based BERT | 56.47 | 56.47
8 | LinCE Organizers | mBERT | 56.43 | 56.43
9 | LinCE Organizers | ELMo small | 52.88 | 52.88
Rank | Team | Model | Avg | ENG-HINGLISH | SPANGLISH-ENG | ENG-SPANGLISH | MSAEA-ENG | ENG-MSAEA
---|---|---|---|---|---|---|---|---
- | UBC_HImt | mT5 | - | 12.67 | - | - | - | -
- | IITP-MT | synthetic-code-mixed | - | 10.09 | - | - | - | -
- | UBC_ARmt | Big Transformer ZS | - | - | - | - | 21.34 | -
- | UBC_ARmt | mT5 | - | - | - | - | 16.41 | -
- | UBC_ARmt | mT5 TC | - | - | - | - | 18.80 | -
- | UBC_ARmt | Big Transformer FT | - | - | - | - | 22.51 | -
- | UBC_ARmt | Big Transformer FT TC | - | - | - | - | 25.72 | -
- | CMMTOne | Gated Sequence to Sequence convolutions | - | 2.58 | - | - | - | -
- | UBC_ARmt | mBART | - | - | - | - | 19.79 | -
- | LTRC-PreCog | mBART-en | - | 12.22 | - | - | - | -
- | LTRC-PreCog | mBART-hien | - | 11.86 | - | - | - | -
- | IBM | mBART Test | - | 11.60 | - | - | - | -
- | B2BT EMNLP 2022 Findings | mBART Multilingual | - | - | 49.29 | - | - | -
- | B2BT EMNLP 2022 Findings | mBART Multilingual B2BT | - | - | 50.37 | - | - | -
- | B2BT EMNLP 2022 Findings | mBART Multilingual + E → S BT | - | - | 50.03 | - | - | -
- | Amazon IML | Comix | - | 12.98 | - | - | - | -
- | Amazon IML | Human | - | 10.43 | - | - | - | -
- | TUG-MT | t5-small_6_3 LinCE BackTranslation | - | 4.97 | - | - | - | -
- | TUG-MT | t5-small_6_3 LinCE | - | 4.88 | - | - | - | -
- | TUG-MT | t5-small_6_3 cmu+LinCE | - | 5.47 | - | - | - | -
- | TUG-MT | t5-small_6_3 cmu+LinCE BackTranslate | - | 4.71 | - | - | - | -