Unsupervised Machine Translation On WMT2016 5

Evaluation Metrics

BLEU
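As a minimal illustration of how corpus-level BLEU is typically computed (this is a sketch using the sacrebleu library, not a description of how the leaderboard scores above were produced; the hypothesis and reference sentences are hypothetical):

```python
# Minimal sketch: corpus-level BLEU with sacrebleu.
# The hypothesis/reference sentences are made-up examples,
# not data from this benchmark.
import sacrebleu

hypotheses = [
    "the cat sat on the mat",
    "there is a book on the table",
]
references = [
    [  # one reference stream; additional streams may be appended
        "the cat sat on the mat",
        "a book is on the table",
    ]
]

bleu = sacrebleu.corpus_bleu(hypotheses, references)
print(f"BLEU = {bleu.score:.2f}")
```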

Evaluation Results

Performance of each model on this benchmark

Model Name                               | BLEU  | Paper Title                                        | Repository
BERT-fused NMT                           | 36.02 | Incorporating BERT into Neural Machine Translation |
MLM pretraining for encoder and decoder  | 33.3  | Cross-lingual Language Model Pretraining           |