Sicilian Translator: A Recipe for Low-Resource NMT

Eryk Wdowiak


Abstract

With 17,000 pairs of Sicilian-English translated sentences, Arba Sicula developed the first neural machine translator for the Sicilian language. Using small subword vocabularies, we trained small Transformer models with high dropout parameters and achieved BLEU scores in the upper 20s. Then we supplemented our dataset with backtranslation and multilingual translation, pushing our scores into the mid 30s. We also attribute our success to incorporating theoretical information into our dataset: prior to training, we biased the subword vocabulary towards the desinences found in a textbook, and we included textbook exercises in our dataset.
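To illustrate the idea of biasing a subword vocabulary towards textbook desinences, here is a minimal toy sketch of a BPE-style merge loop in which candidate merges that form a known inflectional ending receive a score bonus. The sample endings, the bonus scheme, and all function names are illustrative assumptions, not the paper's actual code or hyperparameters.

```python
from collections import Counter

# Hypothetical sketch: bias BPE-style merges towards desinences
# (inflectional endings) taken from a grammar textbook. The endings
# below are a small illustrative sample, not the paper's list.
DESINENCES = {"ari", "iri", "atu", "utu", "anu", "emu"}

def pair_counts(words):
    """Count adjacent symbol pairs across a corpus of symbol sequences."""
    counts = Counter()
    for symbols, freq in words.items():
        for a, b in zip(symbols, symbols[1:]):
            counts[(a, b)] += freq
    return counts

def best_merge(words, bonus=1000):
    """Pick the most frequent pair, boosting pairs that form a desinence."""
    counts = pair_counts(words)
    def score(item):
        (a, b), count = item
        return count + (bonus if (a + b) in DESINENCES else 0)
    return max(counts.items(), key=score)[0]

def apply_merge(words, pair):
    """Replace every occurrence of the pair with its concatenation."""
    a, b = pair
    merged = {}
    for symbols, freq in words.items():
        out, i = [], 0
        while i < len(symbols):
            if i + 1 < len(symbols) and symbols[i] == a and symbols[i + 1] == b:
                out.append(a + b)
                i += 2
            else:
                out.append(symbols[i])
                i += 1
        merged[tuple(out)] = merged.get(tuple(out), 0) + freq
    return merged

# Toy corpus: word (split into characters) -> frequency.
corpus = {tuple("parrari"): 5, tuple("manciari"): 4, tuple("vinutu"): 3}
for _ in range(6):
    corpus = apply_merge(corpus, best_merge(corpus))
```

In this toy run, once the pair ("ar", "i") becomes available, the bonus makes the merge producing the desinence "ari" win over more frequent but linguistically arbitrary pairs. A real pipeline would instead bias a tokenizer such as SentencePiece or subword-nmt, e.g. by seeding its training data or symbol list with the textbook endings.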

Benchmarks

Benchmark                            Methodology    Metrics
machine-translation-on-arba-sicula   Many-to-Many   BLEU (It-Scn): 36.5; BLEU (Scn-It): 30.9
machine-translation-on-arba-sicula   Larger         BLEU (En-Scn): 35.0; BLEU (Scn-En): 36.8
