Question Answering on Quasar-T
Evaluation Metric
EM
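
EM (Exact Match) scores a prediction as 1 only if it matches one of the gold answer strings exactly after normalization, and the leaderboard number is the mean of that per-question score over the test set, as a percentage. Below is a minimal sketch of the metric, assuming the common SQuAD-style normalization (lowercasing, stripping punctuation and English articles, collapsing whitespace); the function names are illustrative, and the exact normalization used for Quasar-T may differ.

```python
import re
import string


def normalize_answer(s: str) -> str:
    """Lowercase, strip punctuation and articles, collapse whitespace."""
    s = s.lower()
    s = "".join(ch for ch in s if ch not in string.punctuation)
    s = re.sub(r"\b(a|an|the)\b", " ", s)
    return " ".join(s.split())


def exact_match(prediction: str, gold_answers: list[str]) -> bool:
    """EM is 1 if the normalized prediction equals any normalized gold answer."""
    pred = normalize_answer(prediction)
    return any(pred == normalize_answer(g) for g in gold_answers)


# Per-question score; the dataset-level EM is the mean over all questions.
print(exact_match("The Reformer", ["reformer"]))  # True
```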
Results
Performance of each model on this benchmark:
| Model | EM | Paper Title | Repository |
|---|---|---|---|
| Cluster-Former (#C=512) | 54 | Cluster-Former: Clustering-based Sparse Transformer for Long-Range Dependency Encoding | - |
| Locality-Sensitive Hashing | 53.2 | Reformer: The Efficient Transformer | - |
| Sparse Attention | 52.1 | Generating Long Sequences with Sparse Transformers | - |
| Multi-passage BERT | 51.1 | Multi-passage BERT: A Globally Normalized BERT Model for Open-domain Question Answering | - |
| Denoising QA | 42.2 | Denoising Distantly Supervised Open-Domain Question Answering | - |
| DECAPROP | 38.6 | Densely Connected Attention Propagation for Reading Comprehension | - |
| DrQA | 37.7 | Reading Wikipedia to Answer Open-Domain Questions | - |