Kim Bum Jun; Kim Sang Woo

Abstract
Regularization of deep neural networks has been an important issue for achieving higher generalization performance without overfitting problems. Although the popular method of Dropout provides a regularization effect, it causes inconsistent properties in the output, which may degrade the performance of deep neural networks. In this study, we propose a new module called stochastic average pooling, which incorporates Dropout-like stochasticity in pooling. We describe the properties of stochastic subsampling and average pooling and leverage them to design a module without any inconsistency problem. Stochastic average pooling achieves a regularization effect without any potential performance degradation due to the inconsistency issue and can easily be plugged into existing architectures of deep neural networks. Experiments demonstrate that replacing existing average pooling with stochastic average pooling yields consistent improvements across a variety of tasks, datasets, and models.
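The abstract describes the idea but not the exact formulation, so the PyTorch sketch below is only one plausible reading: during training, each pooling window averages over a Dropout-style random subset of its elements, and at evaluation the module reduces to plain average pooling. Because the mean of a random subset is an unbiased estimate of the full-window mean, the expected output is consistent between training and evaluation, which is the consistency property the abstract emphasizes. The class name, the `keep_prob` parameter, and the masking scheme are illustrative assumptions, not the authors' reference implementation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class StochasticAveragePooling(nn.Module):
    """Illustrative sketch of stochastic average pooling (assumed formulation).

    Training: each element of a pooling window is kept independently with
    probability `keep_prob`, and the window output is the mean of the kept
    elements only. Evaluation: plain average pooling, since the subset mean
    is an unbiased estimator of the full-window mean.
    """

    def __init__(self, kernel_size=2, stride=None, keep_prob=0.75):
        super().__init__()
        self.kernel_size = kernel_size
        self.stride = stride or kernel_size
        self.keep_prob = keep_prob

    def forward(self, x):
        if not self.training:
            # Deterministic path: ordinary average pooling at evaluation.
            return F.avg_pool2d(x, self.kernel_size, self.stride)

        n, c, h, w = x.shape
        # Extract pooling windows: (N, C * k * k, L) with L = out_h * out_w.
        cols = F.unfold(x, self.kernel_size, stride=self.stride)
        cols = cols.view(n, c, self.kernel_size ** 2, -1)

        # Dropout-style binary mask over window elements.
        mask = (torch.rand_like(cols) < self.keep_prob).float()
        kept = mask.sum(dim=2).clamp(min=1.0)  # guard against empty windows
        out = (cols * mask).sum(dim=2) / kept  # mean over kept elements only

        out_h = (h - self.kernel_size) // self.stride + 1
        out_w = (w - self.kernel_size) // self.stride + 1
        return out.view(n, c, out_h, out_w)


if __name__ == "__main__":
    pool = StochasticAveragePooling(kernel_size=2, keep_prob=0.75)
    x = torch.randn(8, 64, 32, 32)
    y = pool(x)        # (8, 64, 16, 16); stochastic in train mode
    pool.eval()
    y_eval = pool(x)   # matches F.avg_pool2d(x, 2) exactly
```

Since the module keeps the same shape contract as `nn.AvgPool2d`, it can be swapped into an existing network (e.g., reassigning a model's pooling attribute to this module), which matches the drop-in usage the abstract describes.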
Benchmarks
| Benchmark | Methodology | Metrics |
|---|---|---|
| fine-grained-image-classification-on-caltech | SE-ResNet-101 (SAP) | Top-1 Error Rate: 15.949% |
| fine-grained-image-classification-on-oxford-2 | SE-ResNet-101 (SAP) | Accuracy: 86.011% |
| image-classification-on-cifar-10 | ResNet-110 (SAP) | Percentage correct: 93.861 |
| image-classification-on-cifar-100 | ResNet-110 (SAP) | Percentage correct: 72.537 |
| image-classification-on-stanford-cars | SE-ResNet-101 (SAP) | Accuracy: 85.812% |
| object-detection-on-coco-2017 | DyHead (SAP) | AP: 42.1, AP50: 59.4, AP75: 45.9 |
| semantic-segmentation-on-isprs-potsdam | PSPNet (SAP) | Mean IoU: 74.3, Overall Accuracy: 88.56 |
| semantic-segmentation-on-isprs-vaihingen | UPerNet (SAP) | Category mIoU: 73.27, Overall Accuracy: 90.14 |