UniDrop: A Simple yet Effective Technique to Improve Transformer without Extra Cost

Zhen Wu, Lijun Wu, Qi Meng, Yingce Xia, Shufang Xie, Tao Qin, Xinyu Dai, Tie-Yan Liu

Abstract

The Transformer architecture has achieved great success across many natural language processing tasks. The over-parameterization of the Transformer model has motivated a large body of work on alleviating its overfitting to obtain superior performance. Through some exploration, we find that simple techniques such as dropout can greatly boost model performance when carefully designed. Therefore, in this paper, we integrate different dropout techniques into the training of Transformer models. Specifically, we propose an approach named UniDrop that unites three dropout techniques from fine-grained to coarse-grained: feature dropout, structure dropout, and data dropout. Theoretically, we demonstrate that these three dropouts play different roles from a regularization perspective. Empirically, we conduct experiments on both neural machine translation and text classification benchmark datasets. Extensive results indicate that a Transformer with UniDrop can achieve around 1.5 BLEU improvement on IWSLT14 translation tasks, as well as better classification accuracy, even when using the strong pre-trained RoBERTa as the backbone.
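
The three granularities named in the abstract can be made concrete with a short PyTorch sketch. This is a minimal illustration under assumed hyperparameters (`p_feature`, `p_structure`, and `p_data` are placeholders), not the authors' released implementation: feature dropout is ordinary dropout on activations, structure dropout skips whole layers (LayerDrop-style), and data dropout randomly discards input tokens.

```python
import torch
import torch.nn as nn

class UniDropEncoderLayer(nn.Module):
    """Encoder layer combining feature dropout and structure dropout.

    Placement and rates are illustrative assumptions: the paper applies
    feature dropout at several positions inside the layer and a
    LayerDrop-style structure dropout across layers.
    """

    def __init__(self, d_model=512, nhead=4, p_feature=0.1, p_structure=0.2):
        super().__init__()
        # Feature dropout (fine-grained): standard dropout on activations,
        # here via the built-in dropout of the encoder layer.
        self.layer = nn.TransformerEncoderLayer(d_model, nhead, dropout=p_feature)
        self.p_structure = p_structure

    def forward(self, x):
        # Structure dropout (coarse-grained): with probability p_structure,
        # skip the entire layer during training (identity path).
        if self.training and torch.rand(()).item() < self.p_structure:
            return x
        return self.layer(x)


def data_dropout(token_ids, pad_id, p_data=0.1, training=True):
    """Data dropout (coarsest): randomly discard input tokens.

    Dropped positions are replaced by pad_id here for simplicity;
    the paper's description corresponds to removing tokens.
    """
    if not training:
        return token_ids
    keep = torch.rand(token_ids.shape, device=token_ids.device) >= p_data
    return torch.where(keep, token_ids, torch.full_like(token_ids, pad_id))
```

The point of combining them, per the abstract, is that each level regularizes differently: feature dropout perturbs individual activations, structure dropout forces robustness to missing sublayers, and data dropout acts as input-level augmentation.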

Benchmarks

| Benchmark | Methodology | Metrics |
| --- | --- | --- |
| machine-translation-on-iwslt2014-english | UniDrop | BLEU score: 29.99 |
| machine-translation-on-iwslt2014-german | UniDrop | BLEU score: 36.88 |
