Machine Translation
Machine translation is a core task in natural language processing: converting text in a source language into an equivalent expression in a target language. In recent years, neural models built on the encoder-decoder architecture with attention, most notably the Transformer, have made significant progress and greatly improved translation quality. Common automatic evaluation metrics include BLEU, METEOR, and NIST, while the WMT shared-task datasets are widely used benchmark resources.
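Since BLEU is named above as the standard metric, a minimal sketch of sentence-level BLEU may help make it concrete. This is a simplified version (single reference, uniform n-gram weights, no smoothing); production implementations such as sacreBLEU add smoothing, standardized tokenization, and corpus-level aggregation.

```python
import math
from collections import Counter

def ngrams(tokens, n):
    """Count the n-grams of a token list."""
    return Counter(tuple(tokens[i:i + n]) for i in range(len(tokens) - n + 1))

def bleu(candidate, reference, max_n=4):
    """Simplified sentence-level BLEU: geometric mean of modified
    n-gram precisions (n = 1..max_n) times a brevity penalty.
    Single reference, no smoothing: any zero precision gives 0.0."""
    cand = candidate.split()
    ref = reference.split()
    log_precisions = []
    for n in range(1, max_n + 1):
        cand_counts = ngrams(cand, n)
        ref_counts = ngrams(ref, n)
        # Clip each candidate n-gram count by its count in the reference.
        overlap = sum(min(c, ref_counts[g]) for g, c in cand_counts.items())
        total = max(sum(cand_counts.values()), 1)
        if overlap == 0:
            return 0.0
        log_precisions.append(math.log(overlap / total))
    # Brevity penalty: punish candidates shorter than the reference.
    bp = 1.0 if len(cand) > len(ref) else math.exp(1 - len(ref) / max(len(cand), 1))
    return bp * math.exp(sum(log_precisions) / max_n)
```

An exact match scores 1.0, a candidate sharing no unigrams with the reference scores 0.0, and partial overlaps fall in between.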
Benchmarks (dataset: top-performing model):

WMT2014 English-German: Transformer Cycle (Rev)
WMT2014 English-French: Transformer+BT (ADMIN init)
IWSLT2014 German-English: PiNMT
WMT2016 Romanian-English: MLM pretraining
ACES: HWTSC-Teacher-Sim
WMT2016 English-Romanian: DeLighT
WMT2014 German-English: Bi-SimCut
IWSLT2015 German-English: PS-KD
WMT2016 English-German
IWSLT2015 English-Vietnamese: EnViT5 + MTet
WMT2016 German-English
IWSLT2015 English-German: PS-KD
IWSLT2014 English-German
WMT2015 English-German: ByteNet
FLoRes-200: GenTranslate-7B
WMT2016 English-Russian: Attentional encoder-decoder + BPE
WMT 2017 Latvian-English
FRMT (Portuguese - Portugal)
FRMT (Chinese - Taiwan)
WMT 2017 English-Chinese: DynamicConv
FRMT (Chinese - Mainland)
WMT2017 Chinese-English: StrokeNet
flores95-devtest eng-X: SeamlessM4T Large
WMT2014 French-English
flores95-devtest X-eng
FRMT (Portuguese - Brazil)
WMT 2018 Finnish-English
Arba Sicula: Larger
20NEWS: tensorflow/tensor2tensor
WMT2017 Turkish-English
IWSLT2017 German-English: Adaptively Sparse Transformer (alpha-entmax)
IWSLT2017 English-French: Transformer base + BPE-Dropout
WMT2014 English-Czech: Evolved Transformer Big
IWSLT2017 French-English: NLLB-200
IWSLT2017 Arabic-English
IWSLT2017 English-Arabic: Transformer base + BPE-Dropout
Itihasa
WMT2019 English-German: Facebook FAIR (ensemble)
IWSLT2015 Vietnamese-English
WMT 2022 Chinese-English: Vega-MT
WMT2017 Russian-English: OmniNetP
WMT2016 English-French: DeLighT
WMT 2022 German-English
WMT 2022 Japanese-English
IWSLT2015 Thai-English: Seq-KD + Seq-Inter + Word-KD
WMT2015 English-Russian: C2-50k Segmentation
slone/myv_ru_2022 ru-myv
WMT2016 Czech-English: Attentional encoder-decoder + BPE
WMT2016 Russian-English
WMT 2022 English-Czech
slone/myv_ru_2022 myv-ru: slone/mbart-large-51-myv-mul-v1
WMT 2022 English-Chinese
ACCURAT balanced test corpus for under resourced languages Russian-Estonian: Multilingual Transformer
WMT2017 English-French: OmniNetP
WMT2017 English-Finnish: OmniNetP
Tatoeba (EL-to-EN): V_B (trained on T_H)
WMT 2018 Estonian-English: Multi-pass backtranslated adapted transformer
WMT2016 English-Czech
WMT 2018 English-Estonian: Multi-pass backtranslated adapted transformer
Tatoeba (EN-to-EL): PENELOPIE Transformers-based NMT (EN2EL)
Multi Lingual Bug Reports: ChatGPT
WMT 2017 English-Latvian
WMT 2022 Russian-English
IWSLT2015 Chinese-English: BP-Transformer
ACCURAT balanced test corpus for under resourced languages Estonian-Russian
WMT2016 Finnish-English: V_C (trained on T_H)
Business Scene Dialogue EN-JA
WMT2019 English-Japanese
WMT2017 English-German: OmniNetP
IWSLT 2017: GPT-4o (HPT)
WMT2017 Finnish-English
WMT 2018 English-Finnish: Transformer trained on highly filtered data
WMT 2022 English-German
WMT 2022 English-Japanese
WMT2019 German-English
Alexa Point of View: T5
WMT 2022 Czech-English
WMT2019 Finnish-English
WMT 2022 English-Russian: Vega-MT
Business Scene Dialogue JA-EN: Transformer-base
V_A (trained on T_H)
M_C