| BERT-Large-uncased-PruneOFA (90% unstructured sparse) | 83.35 | 90.2 | Prune Once for All: Sparse Pre-Trained Language Models | |
| BERT-Large-uncased-PruneOFA (90% unstructured sparse, QAT Int8) | 83.22 | 90.02 | Prune Once for All: Sparse Pre-Trained Language Models | |
| BERT-Base-uncased-PruneOFA (85% unstructured sparse) | 81.1 | 88.42 | Prune Once for All: Sparse Pre-Trained Language Models | |
| BERT-Base-uncased-PruneOFA (85% unstructured sparse, QAT Int8) | 80.84 | 88.24 | Prune Once for All: Sparse Pre-Trained Language Models | |
| BERT-Base-uncased-PruneOFA (90% unstructured sparse) | 79.83 | 87.25 | Prune Once for All: Sparse Pre-Trained Language Models | |