Natural Language Inference on QNLI

Evaluation Metric

Accuracy
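
QNLI is a binary classification task (entailment vs. not_entailment over question–sentence pairs), so accuracy is simply the fraction of predictions that match the gold labels. A minimal sketch, assuming predictions and gold labels are already available as 0/1 class indices; the function name and example values are illustrative only:

```python
from typing import Sequence

def qnli_accuracy(predictions: Sequence[int], references: Sequence[int]) -> float:
    """Fraction of QNLI examples where the predicted label
    (0 = entailment, 1 = not_entailment) matches the gold label."""
    assert len(predictions) == len(references), "prediction/label count mismatch"
    correct = sum(p == r for p, r in zip(predictions, references))
    return correct / len(references)

# Illustrative example: 4 of 5 predictions are correct -> accuracy 0.8.
print(qnli_accuracy([0, 1, 1, 0, 1], [0, 1, 0, 0, 1]))
```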

Evaluation Results

Results of each model on this benchmark.

| Model | Accuracy | Paper Title | Repository |
| --- | --- | --- | --- |
| ALICE | 99.2% | SMART: Robust and Efficient Fine-Tuning for Pre-trained Natural Language Models through Principled Regularized Optimization | - |
| ALBERT | 99.2% | ALBERT: A Lite BERT for Self-supervised Learning of Language Representations | - |
| StructBERT RoBERTa ensemble | 99.2% | StructBERT: Incorporating Language Structures into Pre-training for Deep Language Understanding | - |
| MT-DNN-SMART | 99.2% | SMART: Robust and Efficient Fine-Tuning for Pre-trained Natural Language Models through Principled Regularized Optimization | - |
| RoBERTa (ensemble) | 98.9% | RoBERTa: A Robustly Optimized BERT Pretraining Approach | - |
| T5-11B | 96.7% | Exploring the Limits of Transfer Learning with a Unified Text-to-Text Transformer | - |
| T5-3B | 96.3% | Exploring the Limits of Transfer Learning with a Unified Text-to-Text Transformer | - |
| DeBERTaV3-large | 96% | DeBERTaV3: Improving DeBERTa using ELECTRA-Style Pre-Training with Gradient-Disentangled Embedding Sharing | - |
| ELECTRA | 95.4% | - | - |
| DeBERTa (large) | 95.3% | DeBERTa: Decoding-enhanced BERT with Disentangled Attention | - |
| XLNet (single model) | 94.9% | XLNet: Generalized Autoregressive Pretraining for Language Understanding | - |
| T5-Large 770M | 94.8% | Exploring the Limits of Transfer Learning with a Unified Text-to-Text Transformer | - |
| RoBERTa-large 355M (MLP quantized vector-wise, fine-tuned) | 94.7% | LLM.int8(): 8-bit Matrix Multiplication for Transformers at Scale | - |
| ERNIE 2.0 Large | 94.6% | ERNIE 2.0: A Continual Pre-training Framework for Language Understanding | - |
| PSQ (Chen et al., 2020) | 94.5% | A Statistical Framework for Low-bitwidth Training of Deep Neural Networks | - |
| RoBERTa-large 355M + Entailment as Few-shot Learner | 94.5% | Entailment as Few-Shot Learner | - |
| SpanBERT | 94.3% | SpanBERT: Improving Pre-training by Representing and Predicting Spans | - |
| TRANS-BLSTM | 94.08% | TRANS-BLSTM: Transformer with Bidirectional LSTM for Language Understanding | - |
| T5-Base | 93.7% | Exploring the Limits of Transfer Learning with a Unified Text-to-Text Transformer | - |
| ASA + RoBERTa | 93.6% | Adversarial Self-Attention for Language Understanding | - |
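
The scores above are reported on the hidden QNLI test set through the GLUE benchmark server; for a local sanity check, models are typically scored on the public validation split instead. A minimal sketch of such a validation-set evaluation with the Hugging Face datasets/transformers libraries, assuming an already fine-tuned QNLI checkpoint (the checkpoint name below is a placeholder, and the loop is unbatched for brevity):

```python
import torch
from datasets import load_dataset
from transformers import AutoModelForSequenceClassification, AutoTokenizer

# Placeholder checkpoint: substitute any sequence-classification model fine-tuned on QNLI.
checkpoint = "your-org/your-qnli-model"
tokenizer = AutoTokenizer.from_pretrained(checkpoint)
model = AutoModelForSequenceClassification.from_pretrained(checkpoint)
model.eval()

# GLUE QNLI pairs a question with a context sentence; labels: 0 = entailment, 1 = not_entailment.
dataset = load_dataset("glue", "qnli", split="validation")

correct = 0
for example in dataset:
    inputs = tokenizer(example["question"], example["sentence"],
                       truncation=True, return_tensors="pt")
    with torch.no_grad():
        logits = model(**inputs).logits
    prediction = int(logits.argmax(dim=-1))
    correct += int(prediction == example["label"])

print(f"QNLI validation accuracy: {correct / len(dataset):.4f}")
```

Validation accuracy computed this way is only an approximation of the leaderboard numbers, since official test-set labels are not public and results must be submitted to the GLUE server.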