Natural Language Inference on SNLI

Evaluation Metrics

% Test Accuracy
% Train Accuracy
Parameters
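The leaderboard's primary metric is classification accuracy over SNLI's three-way labels (entailment / neutral / contradiction). As a minimal illustrative sketch (the function name and example data below are assumptions, not from the leaderboard), the metric is simply the fraction of examples whose predicted label matches the gold label:

```python
# Minimal sketch of the accuracy metric used on SNLI-style three-way
# labels. The example predictions/gold labels are purely illustrative.

def accuracy(predictions, gold_labels):
    """Return the fraction of examples where prediction equals gold label."""
    assert len(predictions) == len(gold_labels), "length mismatch"
    correct = sum(p == g for p, g in zip(predictions, gold_labels))
    return correct / len(gold_labels)

preds = ["entailment", "neutral", "contradiction", "entailment"]
gold  = ["entailment", "neutral", "entailment", "entailment"]
print(f"Test accuracy: {accuracy(preds, gold) * 100:.1f}%")  # 75.0%
```

"% Train Accuracy" is computed the same way over the training split; a large gap between train and test accuracy (e.g. MT-DNN's 97.2 vs 91.6 below) usually indicates some degree of overfitting.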

Evaluation Results

Performance of the various models on this benchmark

| Model | % Test Accuracy | % Train Accuracy | Parameters | Paper Title | Repository |
|---|---|---|---|---|---|
| UnitedSynT5 (3B) | 94.7 | - | - | First Train to Generate, then Generate to Train: UnitedSynT5 for Few-Shot NLI | - |
| UnitedSynT5 (335M) | 93.5 | - | - | First Train to Generate, then Generate to Train: UnitedSynT5 for Few-Shot NLI | - |
| Neural Tree Indexers for Text Understanding | 93.1 | - | 355 | Entailment as Few-Shot Learner | |
| EFL (Entailment as Few-shot Learner) + RoBERTa-large | 93.1 | ? | 355m | Entailment as Few-Shot Learner | |
| RoBERTa-large + self-explaining layer | 92.3 | ? | 355m+ | Self-Explaining Structures Improve NLP Models | |
| RoBERTa-large + Self-Explaining | 92.3 | - | 340 | Self-Explaining Structures Improve NLP Models | |
| CA-MTL | 92.1 | 92.6 | 340m | Conditionally Adaptive Multi-Task Learning: Improving Transfer Learning in NLP Using Fewer Parameters & Less Data | |
| SemBERT | 91.9 | 94.4 | 339m | Semantics-aware BERT for Language Understanding | |
| MT-DNN-SMARTLARGEv0 | 91.7 | - | - | SMART: Robust and Efficient Fine-Tuning for Pre-trained Natural Language Models through Principled Regularized Optimization | |
| MT-DNN | 91.6 | 97.2 | 330m | Multi-Task Deep Neural Networks for Natural Language Understanding | |
| SJRC (BERT-Large + SRL) | 91.3 | 95.7 | 308m | Explicit Contextual Semantics for Text Comprehension | - |
| Ntumpha | 90.5 | 99.1 | 220 | Multi-Task Deep Neural Networks for Natural Language Understanding | |
| Densely-Connected Recurrent and Co-Attentive Network Ensemble | 90.1 | 95.0 | 53.3m | Semantic Sentence Matching with Densely-connected Recurrent and Co-attentive Information | - |
| MFAE | 90.07 | 93.18 | - | What Do Questions Exactly Ask? MFAE: Duplicate Question Identification with Multi-Fusion Asking Emphasis | - |
| Fine-Tuned LM-Pretrained Transformer | 89.9 | 96.6 | 85m | Improving Language Understanding by Generative Pre-Training | - |
| 300D DMAN Ensemble | 89.6 | 96.1 | 79m | Discourse Marker Augmented Network with Reinforcement Learning for Natural Language Inference | |
| 150D Multiway Attention Network Ensemble | 89.4 | 95.5 | 58m | Multiway Attention Networks for Modeling Sentence Pairs | - |
| ESIM + ELMo Ensemble | 89.3 | 92.1 | 40m | Deep contextualized word representations | |
| 450D DR-BiLSTM Ensemble | 89.3 | 94.8 | 45m | DR-BiLSTM: Dependent Reading Bidirectional LSTM for Natural Language Inference | - |