HyperAI


Unsupervised Machine Translation on WMT2016 2

Metrics

BLEU
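BLEU scores a machine-translated candidate against a human reference by combining modified n-gram precisions with a brevity penalty. As a rough illustration of the metric behind the numbers below, here is a minimal single-reference sentence-level BLEU sketch in plain Python; leaderboard scores are computed with standard tooling (e.g. corpus-level BLEU with tokenization conventions), so treat this as an approximation, not the evaluation script used here.

```python
import math
from collections import Counter

def ngrams(tokens, n):
    """All contiguous n-grams of a token list."""
    return [tuple(tokens[i:i + n]) for i in range(len(tokens) - n + 1)]

def bleu(candidate, reference, max_n=4):
    """Minimal sentence-level BLEU with a single reference:
    geometric mean of modified n-gram precisions (n = 1..max_n),
    multiplied by a brevity penalty. Crude epsilon smoothing is
    used to avoid log(0); real implementations smooth differently."""
    precisions = []
    for n in range(1, max_n + 1):
        cand_counts = Counter(ngrams(candidate, n))
        ref_counts = Counter(ngrams(reference, n))
        # Clipped overlap: each candidate n-gram counts at most
        # as often as it appears in the reference.
        overlap = sum((cand_counts & ref_counts).values())
        total = max(sum(cand_counts.values()), 1)
        precisions.append(max(overlap, 1e-9) / total)
    log_avg = sum(math.log(p) for p in precisions) / max_n
    # Brevity penalty: punish candidates shorter than the reference.
    bp = min(1.0, math.exp(1 - len(reference) / max(len(candidate), 1)))
    return bp * math.exp(log_avg)
```

An identical candidate and reference score 1.0; any n-gram mismatch or length shortfall pulls the score below that, which is why the leaderboard values (reported on a 0-100 scale, i.e. BLEU x 100) rank higher-precision systems first.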

Results

Performance results of various models on this benchmark

Model | BLEU | Paper Title | Repository
MASS (6-layer Transformer) | 35.2 | MASS: Masked Sequence to Sequence Pre-training for Language Generation | —
MLM pretraining for encoder and decoder | 33.3 | Cross-lingual Language Model Pretraining | —
GPT-3 175B (Few-Shot) | 21 | Language Models are Few-Shot Learners | —