HyperAI
Alleviating Sequence Information Loss with Data Overlapping and Prime Batch Sizes

Noémien Kocher Christian Scuito Lorenzo Tarantino Alexandros Lazaridis Andreas Fischer Claudiu Musat


Abstract

In sequence modeling tasks the token order matters, but this information can be partially lost due to the discretization of the sequence into data points. In this paper, we study the imbalance between the way certain token pairs are included in data points while others are not. We denote this token order imbalance (TOI), and we link the partial sequence information loss to diminished performance of the system as a whole, in both text and speech processing tasks. We then provide a mechanism, Alleviated TOI, that leverages the full token order information by iteratively overlapping the token composition of data points. For recurrent networks, we use prime numbers for the batch size to avoid redundancies when building batches from overlapped data points. The proposed method achieves state-of-the-art performance in both text- and speech-related tasks.
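The core idea in the abstract can be illustrated with a minimal sketch: slice the token stream into fixed-length data points whose starting positions advance by a shift smaller than the sequence length (so consecutive points overlap and every adjacent token pair lands inside some data point), and pick a prime batch size. The function names and parameters below are illustrative, not the authors' actual API from the official repository.

```python
def overlapped_data_points(tokens, seq_len, shift):
    """Slice a token stream into fixed-length data points whose starts
    advance by `shift` (< seq_len), so consecutive points overlap and
    no adjacent token pair is lost at a data-point boundary."""
    points = []
    start = 0
    while start + seq_len <= len(tokens):
        points.append(tokens[start:start + seq_len])
        start += shift
    return points

def is_prime(n):
    return n >= 2 and all(n % d for d in range(2, int(n ** 0.5) + 1))

def nearest_prime_at_most(n):
    """Pick a prime batch size <= n, following the paper's recipe of
    using prime batch sizes to avoid redundant batches when cycling
    over overlapped data points."""
    while n >= 2 and not is_prime(n):
        n -= 1
    return n

# Example: tokens 0..9, sequence length 4, shift 2 -> overlapping windows.
points = overlapped_data_points(list(range(10)), seq_len=4, shift=2)
batch_size = nearest_prime_at_most(64)
```

With `shift` equal to `seq_len` this degenerates to the standard non-overlapping discretization, which is exactly where the token order imbalance described above arises.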

Code Repositories

nkcr/overlap-ml — official PyTorch implementation (mentioned in GitHub)

Benchmarks

Benchmark: language-modelling-on-wikitext-103
Methodology: AWD-LSTM-MoS + ATOI
Metrics: Test perplexity 32.85; Validation perplexity 31.92

Benchmark: language-modelling-on-wikitext-2
Methodology: AWD-LSTM + ATOI
Metrics: Number of params 33M; Test perplexity 64.73; Validation perplexity 67.47
