Quantization on ImageNet

Evaluation Metric

Top-1 Accuracy (%)
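
For reference, the metric is the share of ImageNet validation images for which the model's highest-scoring class matches the ground-truth label, reported in percent. A minimal PyTorch sketch, assuming a classification model and an ImageNet validation DataLoader (all names here are illustrative):

```python
import torch

@torch.no_grad()
def top1_accuracy(model, loader, device="cpu"):
    """Percentage of images whose arg-max prediction equals the label."""
    model.eval().to(device)
    correct = total = 0
    for images, labels in loader:
        logits = model(images.to(device))  # (batch, 1000) class scores
        correct += (logits.argmax(dim=1) == labels.to(device)).sum().item()
        total += labels.size(0)
    return 100.0 * correct / total  # percent, as reported in the table below
```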

Evaluation Results

Results of each model on this benchmark. Entries labeled WxAy quantize weights to x bits and activations to y bits (e.g., W8A8 means 8-bit weights and 8-bit activations); illustrative sketches of the quantization schemes follow the table.

| Model | Top-1 Accuracy (%) | Paper |
|---|---|---|
| FQ-ViT (ViT-L) | 85.03 | FQ-ViT: Post-Training Quantization for Fully Quantized Vision Transformer |
| FQ-ViT (ViT-B) | 83.31 | FQ-ViT: Post-Training Quantization for Fully Quantized Vision Transformer |
| FQ-ViT (Swin-B) | 82.97 | FQ-ViT: Post-Training Quantization for Fully Quantized Vision Transformer |
| FQ-ViT (Swin-S) | 82.71 | FQ-ViT: Post-Training Quantization for Fully Quantized Vision Transformer |
| FQ-ViT (DeiT-B) | 81.20 | FQ-ViT: Post-Training Quantization for Fully Quantized Vision Transformer |
| FQ-ViT (Swin-T) | 80.51 | FQ-ViT: Post-Training Quantization for Fully Quantized Vision Transformer |
| FQ-ViT (DeiT-S) | 79.17 | FQ-ViT: Post-Training Quantization for Fully Quantized Vision Transformer |
| Xception W8A8 | 78.972 | HPTQ: Hardware-Friendly Post Training Quantization |
| ADLIK-MO-ResNet50-W4A4 | 77.878 | Learned Step Size Quantization |
| ADLIK-MO-ResNet50-W3A4 | 77.34 | Learned Step Size Quantization |
| EfficientNet-B0 ReLU W8A8 | 77.092 | HPTQ: Hardware-Friendly Post Training Quantization |
| ResNet50-W4A4 (paper) | 76.7 | Learned Step Size Quantization |
| EfficientNet-B0-W8A8 | 76.4 | HMQ: Hardware Friendly Mixed Precision Quantization Block for CNNs |
| EfficientNet-B0-W4A4 | 76 | HMQ: Hardware Friendly Mixed Precision Quantization Block for CNNs |
| ResNet50-W3A4 | 75.45 | HMQ: Hardware Friendly Mixed Precision Quantization Block for CNNs |
| EfficientNet-B0 W8A8 | 74.216 | HPTQ: Hardware-Friendly Post Training Quantization |
| MPT (80) +BN | 74.03 | Multi-Prize Lottery Ticket Hypothesis: Finding Accurate Binary Neural Networks by Pruning A Randomly Weighted Network |
| EfficientNet-W4A4 | 73.8 | LSQ+: Improving low-bit quantization through learnable offsets and better initialization |
| DenseNet-121 W8A8 | 73.356 | HPTQ: Hardware-Friendly Post Training Quantization |
| MixNet-W4A4 | 71.7 | LSQ+: Improving low-bit quantization through learnable offsets and better initialization |
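
Most entries in the table are measured with fake quantization: tensors are mapped onto a b-bit integer grid and back to float so Top-1 accuracy can be evaluated without integer kernels. Below is a minimal sketch of uniform min/max-calibrated quantization, a generic stand-in for (not an exact reproduction of) the post-training schemes in papers such as HPTQ and FQ-ViT:

```python
import torch

def fake_quantize(x, bits, symmetric=True):
    """Uniform fake quantization: round x onto a b-bit integer grid,
    then dequantize back to float (min/max calibration, illustrative)."""
    if symmetric:                                  # typical for weights
        qmax = 2 ** (bits - 1) - 1
        scale = x.abs().max() / qmax
        q = torch.clamp((x / scale).round(), -qmax - 1, qmax)
        return q * scale
    qmax = 2 ** bits - 1                           # asymmetric, e.g. post-ReLU
    lo, hi = x.min(), x.max()
    scale = (hi - lo) / qmax
    zero = (-lo / scale).round()
    q = torch.clamp((x / scale + zero).round(), 0, qmax)
    return (q - zero) * scale

# W8A8: 8-bit weights (symmetric), 8-bit activations (asymmetric)
w_q = fake_quantize(torch.randn(64, 128), bits=8)
a_q = fake_quantize(torch.rand(32, 128), bits=8, symmetric=False)
```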
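The LSQ and LSQ+ entries instead learn the quantization step size end to end, passing gradients through the rounding operation with a straight-through estimator. A rough sketch of that idea for symmetric signed quantization; the gradient-scaling trick follows the LSQ paper, but the class and its details are illustrative, not the authors' code:

```python
import torch
import torch.nn as nn

class LSQQuantizer(nn.Module):
    """Fake quantizer with a learnable step size (illustrative sketch)."""
    def __init__(self, bits=4):
        super().__init__()
        self.qn = -(2 ** (bits - 1))        # e.g. -8 for 4-bit weights
        self.qp = 2 ** (bits - 1) - 1       # e.g. +7 for 4-bit weights
        self.step = nn.Parameter(torch.tensor(1.0))

    def forward(self, x):
        # Scale the step-size gradient by 1/sqrt(numel * qp), as in LSQ.
        g = 1.0 / (x.numel() * self.qp) ** 0.5
        s = self.step * g + (self.step - self.step * g).detach()
        q = torch.clamp(x / s, self.qn, self.qp)
        q = (q.round() - q).detach() + q    # straight-through estimator
        return q * s

w = torch.randn(64, 3, 3, 3, requires_grad=True)
w_q = LSQQuantizer(bits=4)(w)   # differentiable w.r.t. both w and the step size
w_q.sum().backward()
```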