Machine Translation on WMT2016 Romanian

Evaluation Metric

BLEU score
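All entries below report corpus-level BLEU. As a minimal, hedged sketch of how this metric is typically computed for WMT test sets, the snippet below uses the sacrebleu library; the hypothesis and reference sentences are placeholder examples, not data from this benchmark.

```python
# Minimal sketch of corpus-level BLEU scoring with sacrebleu
# (assumed installed: pip install sacrebleu).
# Hypotheses and references below are placeholders, not actual
# WMT16 Romanian-English data.
import sacrebleu

# System outputs: one detokenized sentence per line of the test set.
hypotheses = [
    "the cat sat on the mat",
    "there is a book on the table",
]

# References: a list of reference streams; each stream is one full set
# of reference translations aligned with the hypotheses.
references = [[
    "the cat sat on the mat",
    "a book is on the table",
]]

bleu = sacrebleu.corpus_bleu(hypotheses, references)
print(f"BLEU = {bleu.score:.2f}")  # corpus-level score on a 0-100 scale
```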

Evaluation Results

Performance of the individual models on this benchmark

| Model | BLEU score | Paper Title | Repository |
| --- | --- | --- | --- |
| fast-noisy-channel-modeling | 40.3 | Language Models not just for Pre-training: Fast Online Neural Noisy Channel Modeling | |
| FLAN 137B (few-shot, k=9) | 38.1 | Finetuned Language Models Are Zero-Shot Learners | |
| FLAN 137B (zero-shot) | 37.3 | Finetuned Language Models Are Zero-Shot Learners | |
| MLM pretraining | 35.3 | Cross-lingual Language Model Pretraining | |
| GenTranslate | 33.5 | GenTranslate: Large Language Models are Generative Multilingual Speech and Machine Translators | |
| Attentional encoder-decoder + BPE | 33.3 | Edinburgh Neural Machine Translation Systems for WMT 16 | |
| Levenshtein Transformer (distillation) | 33.26 | Levenshtein Transformer | |
| CMLM+LAT+4 iterations | 33.26 | Incorporating a Local Translation Mechanism into Non-autoregressive Translation | |
| Adaptively Sparse Transformer (1.5-entmax) | 33.1 | Adaptively Sparse Transformers | |
| HeadMask (Impt-18) | 32.95 | Alleviating the Inequality of Attention Heads for Neural Machine Translation | - |
| FlowSeq-large (NPD n = 30) | 32.91 | FlowSeq: Non-Autoregressive Conditional Sequence Generation with Generative Flow | |
| Adaptively Sparse Transformer (alpha-entmax) | 32.89 | Adaptively Sparse Transformers | |
| HeadMask (Random-18) | 32.85 | Alleviating the Inequality of Attention Heads for Neural Machine Translation | - |
| FlowSeq-large (NPD n = 15) | 32.46 | FlowSeq: Non-Autoregressive Conditional Sequence Generation with Generative Flow | |
| FlowSeq-large (IWD n = 15) | 32.03 | FlowSeq: Non-Autoregressive Conditional Sequence Generation with Generative Flow | |
| NAT +FT + NPD | 31.44 | Non-Autoregressive Neural Machine Translation | |
| CMLM+LAT+1 iterations | 31.24 | Incorporating a Local Translation Mechanism into Non-autoregressive Translation | |
| FlowSeq-large | 30.69 | FlowSeq: Non-Autoregressive Conditional Sequence Generation with Generative Flow | |
| Denoising autoencoders (non-autoregressive) | 30.30 | Deterministic Non-Autoregressive Neural Sequence Modeling by Iterative Refinement | |
| FlowSeq-base | 30.16 | FlowSeq: Non-Autoregressive Conditional Sequence Generation with Generative Flow | |