Guidelines for the Regularization of Gammas in Batch Normalization for Deep Residual Networks

Bum Jun Kim, Hyeyeon Choi, Hyeonah Jang, Dong Gu Lee, Wonseok Jeong, Sang Woo Kim

Abstract

L2 regularization for weights in neural networks is widely used as a standard training trick. However, L2 regularization for gamma, a trainable parameter of batch normalization, remains largely undiscussed and is applied in different ways depending on the library and practitioner. In this paper, we study whether L2 regularization for gamma is valid. To explore this issue, we consider two approaches: 1) variance control to make the residual network behave like identity mapping and 2) stable optimization through the improvement of the effective learning rate. Through these two analyses, we identify which gammas benefit from L2 regularization and which do not, and propose four guidelines for managing them. In several experiments, we observed performance gains and drops caused by applying L2 regularization to the four categories of gamma, consistent with our four guidelines. Our proposed guidelines were validated on various tasks and architectures, including variants of residual networks and transformers.
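In practice, applying different L2 regularization (weight decay) to different parameters means partitioning a model's parameters into optimizer groups. As a minimal sketch of that mechanism, the helper below splits PyTorch-style parameter names into decay and no-decay groups by a naming convention; the `no_decay_keys` and the example parameter names are illustrative assumptions, not the paper's actual guideline, which distinguishes gammas by their position in the network.

```python
def split_weight_decay_groups(named_params, no_decay_keys=("bn", "norm")):
    """Partition parameter names into (decay, no_decay) groups.

    In PyTorch-style models, batch-norm gamma typically shows up as a
    '.weight' of a normalization module (e.g. 'bn1.weight'). Excluding
    such names from L2 regularization is one common way to implement a
    "no weight decay on gamma" policy. Names are used here instead of
    tensors to keep the sketch self-contained.
    """
    decay, no_decay = [], []
    for name in named_params:
        # Route any parameter whose name mentions a normalization layer
        # (hypothetical naming convention) to the no-decay group.
        if any(key in name for key in no_decay_keys):
            no_decay.append(name)
        else:
            decay.append(name)
    return decay, no_decay


# Illustrative parameter names for a small residual block:
params = ["conv1.weight", "bn1.weight", "bn1.bias", "fc.weight", "layernorm.weight"]
decay, no_decay = split_weight_decay_groups(params)
# decay    -> ["conv1.weight", "fc.weight"]
# no_decay -> ["bn1.weight", "bn1.bias", "layernorm.weight"]
```

With real tensors, the two groups would be passed to the optimizer as `[{"params": decay, "weight_decay": 1e-4}, {"params": no_decay, "weight_decay": 0.0}]`.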

Benchmarks

| Benchmark | Methodology | Metrics |
|---|---|---|
| machine-translation-on-iwslt2014-german | Transformer | BLEU score: 35.1385 |
| text-classification-on-glue-sst2 | BERT | Accuracy: 92.0872 |
