A Joint Many-Task Model: Growing a Neural Network for Multiple NLP Tasks

Kazuma Hashimoto; Caiming Xiong; Yoshimasa Tsuruoka; Richard Socher

Abstract

Transfer and multi-task learning have traditionally focused on either a single source-target pair or very few, similar tasks. Ideally, the linguistic levels of morphology, syntax, and semantics would benefit each other by being trained in a single model. We introduce a joint many-task model together with a strategy for successively growing its depth to solve increasingly complex tasks. Higher layers include shortcut connections to lower-level task predictions to reflect linguistic hierarchies. We use a simple regularization term that allows all model weights to be optimized for one task's loss without catastrophic interference with the other tasks. Our single end-to-end model obtains state-of-the-art or competitive results on five different tasks spanning tagging, parsing, semantic relatedness, and textual entailment.
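
The sketch below illustrates the two ideas highlighted in the abstract: higher task layers receive a shortcut connection to the lower layer's label predictions, and a "successive" L2 penalty keeps previously trained parameters from drifting when a new task is optimized. It is a minimal, hedged illustration with hypothetical class and variable names, not the authors' released implementation, and it shows only two of the five tasks (POS tagging feeding chunking).

```python
# Minimal sketch of the joint many-task idea: stacked task layers with a
# shortcut connection to lower-level predictions, plus successive
# regularization against a parameter snapshot from the previous stage.
# All names here are illustrative assumptions, not the paper's code.
import torch
import torch.nn as nn


class TaskLayer(nn.Module):
    """One level of the stack: a bi-LSTM encoder and a per-token classifier."""

    def __init__(self, input_dim: int, hidden_dim: int, num_labels: int):
        super().__init__()
        self.encoder = nn.LSTM(input_dim, hidden_dim, batch_first=True,
                               bidirectional=True)
        self.classifier = nn.Linear(2 * hidden_dim, num_labels)

    def forward(self, x: torch.Tensor):
        hidden, _ = self.encoder(x)       # (batch, seq, 2 * hidden_dim)
        logits = self.classifier(hidden)  # (batch, seq, num_labels)
        return hidden, logits


class JointManyTaskSketch(nn.Module):
    """Two stacked tagging tasks (POS -> chunking). The chunking layer sees
    the word embeddings, the POS layer's hidden states, and the soft POS
    label distribution -- the shortcut connection to lower-level predictions."""

    def __init__(self, embed_dim=100, hidden_dim=100, n_pos=45, n_chunk=23):
        super().__init__()
        self.pos_layer = TaskLayer(embed_dim, hidden_dim, n_pos)
        chunk_in = embed_dim + 2 * hidden_dim + n_pos
        self.chunk_layer = TaskLayer(chunk_in, hidden_dim, n_chunk)

    def forward(self, embeddings: torch.Tensor):
        pos_hidden, pos_logits = self.pos_layer(embeddings)
        pos_probs = pos_logits.softmax(dim=-1)
        chunk_input = torch.cat([embeddings, pos_hidden, pos_probs], dim=-1)
        _, chunk_logits = self.chunk_layer(chunk_input)
        return pos_logits, chunk_logits


def successive_regularization(module: nn.Module, snapshot: dict, delta: float):
    """Penalize drift of parameters away from their values after the previous
    training stage, so optimizing a higher task does not catastrophically
    interfere with what the lower task has already learned."""
    penalty = 0.0
    for name, param in module.named_parameters():
        penalty = penalty + (param - snapshot[name]).pow(2).sum()
    return delta * penalty


if __name__ == "__main__":
    model = JointManyTaskSketch()
    # Snapshot the lower (POS) layer, e.g. after finishing its training epoch.
    snapshot = {n: p.detach().clone()
                for n, p in model.pos_layer.named_parameters()}
    embeddings = torch.randn(2, 7, 100)   # (batch, seq, embed_dim)
    pos_logits, chunk_logits = model(embeddings)
    chunk_gold = torch.randint(0, 23, (2, 7))
    loss = nn.functional.cross_entropy(chunk_logits.reshape(-1, 23),
                                       chunk_gold.reshape(-1))
    loss = loss + successive_regularization(model.pos_layer, snapshot,
                                            delta=1e-2)
    loss.backward()
```

In this sketch the chunking loss is augmented with the penalty on the POS layer's parameters, so training the higher task can still adjust the shared lower layer, but only gently; the full model in the paper applies the same pattern across all five tasks.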

Code Repositories

hassyGo/charNgram2vec (official)

Benchmarks

Benchmark                    Methodology   Metrics
chunking-on-penn-treebank    JMT           F1 score: 95.77
