Question Answering on WikiQA

Evaluation Metrics

MAP (Mean Average Precision)
MRR (Mean Reciprocal Rank)
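For answer sentence selection, both metrics are computed from the binary relevance labels of each question's candidate answers, taken in the order the model ranked them. A minimal sketch of the two metrics (function names are illustrative, not from any particular library):

```python
from typing import List, Sequence, Tuple

def average_precision(rels: Sequence[int]) -> float:
    """AP for one question: rels are binary relevance labels of the
    candidate answers in model-ranked order (best first)."""
    hits, score = 0, 0.0
    for k, r in enumerate(rels, start=1):
        if r:
            hits += 1
            score += hits / k  # precision at this relevant rank
    return score / hits if hits else 0.0

def reciprocal_rank(rels: Sequence[int]) -> float:
    """1 / rank of the first relevant answer (0 if none is relevant)."""
    for k, r in enumerate(rels, start=1):
        if r:
            return 1.0 / k
    return 0.0

def map_mrr(ranked: List[Sequence[int]]) -> Tuple[float, float]:
    """Macro-average AP and RR over all questions."""
    n = len(ranked)
    return (sum(average_precision(q) for q in ranked) / n,
            sum(reciprocal_rank(q) for q in ranked) / n)
```

For example, `map_mrr([[0, 1, 1], [1, 0]])` averages over two questions: the first has its relevant answers at ranks 2 and 3, the second at rank 1.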

Evaluation Results

Performance of each model on this benchmark

| Model | MAP | MRR | Paper Title |
| --- | --- | --- | --- |
| TANDA-DeBERTa-V3-Large + ALL | 0.927 | 0.939 | Structural Self-Supervised Objectives for Transformers |
| RLAS-BIABC | 0.924 | 0.908 | RLAS-BIABC: A Reinforcement Learning-Based Answer Selection Using the BERT Model Boosted by an Improved ABC Algorithm |
| TANDA-RoBERTa (ASNQ, WikiQA) | 0.920 | 0.933 | TANDA: Transfer and Adapt Pre-Trained Transformer Models for Answer Sentence Selection |
| DeBERTa-V3-Large + ALL | 0.909 | 0.920 | Pre-training Transformer Models with Sentence-Level Objectives for Answer Sentence Selection |
| DeBERTa-Large + SSP | 0.901 | 0.914 | Pre-training Transformer Models with Sentence-Level Objectives for Answer Sentence Selection |
| RoBERTa-Base + SSP | 0.887 | 0.899 | Pre-training Transformer Models with Sentence-Level Objectives for Answer Sentence Selection |
| RoBERTa-Base Joint MSPP | 0.887 | 0.900 | Paragraph-based Transformer Pre-training for Multi-Sentence Inference |
| Comp-Clip + LM + LC | 0.764 | 0.784 | A Compare-Aggregate Model with Latent Clustering for Answer Selection |
| RE2 | 0.7452 | 0.7618 | Simple and Effective Text Matching with Richer Alignment Features |
| HyperQA | 0.712 | 0.727 | Hyperbolic Representation Learning for Fast and Efficient Neural Question Answering |
| PWIM | 0.7090 | 0.7234 | - |
| Key-Value Memory Network | 0.7069 | 0.7265 | Key-Value Memory Networks for Directly Reading Documents |
| LDC | 0.7058 | 0.7226 | Sentence Similarity Learning by Lexical Decomposition and Composition |
| PairwiseRank + Multi-Perspective CNN | 0.7010 | 0.7180 | Noise Contrastive Estimation and Negative Sampling for Conditional Models: Consistency and Statistical Efficiency |
| AP-CNN | 0.6886 | 0.6957 | Attentive Pooling Networks |
| Attentive LSTM | 0.6886 | 0.7069 | Neural Variational Inference for Text Processing |
| LSTM (lexical overlap + dist output) | 0.682 | 0.6988 | Neural Variational Inference for Text Processing |
| MMA-NSE attention | 0.6811 | 0.6993 | Neural Semantic Encoders |
| SWEM-concat | 0.6788 | 0.6908 | Baseline Needs More Love: On Simple Word-Embedding-Based Models and Associated Pooling Mechanisms |
| LSTM | 0.6552 | 0.6747 | Neural Variational Inference for Text Processing |