Language Modelling On Wiki 40B
Evaluation Metric
Perplexity
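Perplexity is the exponential of the average per-token cross-entropy (in nats); lower is better. The sketch below is a minimal, hypothetical illustration of that computation in PyTorch — the random logits and targets are placeholders, not the evaluation pipeline used for the results in this table.

```python
import math
import torch
import torch.nn.functional as F

def perplexity(logits: torch.Tensor, targets: torch.Tensor) -> float:
    """Perplexity = exp(mean per-token cross-entropy in nats).

    logits:  (batch, seq_len, vocab_size) unnormalized next-token scores
    targets: (batch, seq_len) ground-truth token ids
    """
    # Flatten batch and sequence dims, then average the negative
    # log-likelihood assigned to the correct next tokens.
    nll = F.cross_entropy(
        logits.reshape(-1, logits.size(-1)),
        targets.reshape(-1),
        reduction="mean",
    )
    return math.exp(nll.item())

# Toy usage with random scores over a 100-token vocabulary (illustrative only).
if __name__ == "__main__":
    logits = torch.randn(2, 16, 100)
    targets = torch.randint(0, 100, (2, 16))
    print(f"perplexity: {perplexity(logits, targets):.2f}")
```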
Evaluation Results
Performance of each model on this benchmark.
| Model | Perplexity | Paper Title | Repository |
|---|---|---|---|
| Combiner-Fixed-8k | 16.60 | Combiner: Full Attention Transformer with Sparse Computation Cost | |
| Combiner-Axial-8k | 16.49 | Combiner: Full Attention Transformer with Sparse Computation Cost | |
| FLASH-Quad-8k | 14.998 | Transformer Quality in Linear Time | |