Learning General Purpose Distributed Sentence Representations via Large Scale Multi-task Learning

Sandeep Subramanian, Adam Trischler, Yoshua Bengio, Christopher J. Pal

Abstract

A lot of the recent success in natural language processing (NLP) has been driven by distributed vector representations of words trained on large amounts of text in an unsupervised manner. These representations are typically used as general purpose features for words across a range of NLP problems. However, extending this success to learning representations of sequences of words, such as sentences, remains an open problem. Recent work has explored unsupervised as well as supervised learning techniques with different training objectives to learn general purpose fixed-length sentence representations. In this work, we present a simple, effective multi-task learning framework for sentence representations that combines the inductive biases of diverse training objectives in a single model. We train this model on several data sources with multiple training objectives on over 100 million sentences. Extensive experiments demonstrate that sharing a single recurrent sentence encoder across weakly related tasks leads to consistent improvements over previous methods. We present substantial improvements in the context of transfer learning and low-resource settings using our learned general-purpose representations.
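The core idea of the framework is easiest to see as a shared recurrent encoder feeding several task-specific output heads, with training alternating between tasks so the shared parameters absorb each objective's inductive bias. The sketch below illustrates that structure only; the module names, dimensions, task mix, and simple linear heads are illustrative assumptions, not the authors' GenSen implementation (which uses sequence-to-sequence decoders for some objectives).

```python
# Minimal sketch of a shared sentence encoder trained with multiple task heads.
# Names, dimensions, and the task mix are illustrative assumptions, not the
# authors' actual implementation.
import torch
import torch.nn as nn

class SharedSentenceEncoder(nn.Module):
    """Bidirectional GRU encoder shared across all training tasks."""
    def __init__(self, vocab_size, emb_dim=512, hidden_dim=1024):
        super().__init__()
        self.embedding = nn.Embedding(vocab_size, emb_dim, padding_idx=0)
        self.gru = nn.GRU(emb_dim, hidden_dim, bidirectional=True, batch_first=True)

    def forward(self, token_ids):
        embedded = self.embedding(token_ids)      # (batch, seq, emb_dim)
        outputs, _ = self.gru(embedded)           # (batch, seq, 2 * hidden_dim)
        # Max-pool over time to obtain a fixed-length sentence vector.
        sentence_vec, _ = outputs.max(dim=1)
        return sentence_vec

class MultiTaskModel(nn.Module):
    """One shared encoder, one lightweight head per training objective."""
    def __init__(self, vocab_size, task_output_sizes, hidden_dim=1024):
        super().__init__()
        self.encoder = SharedSentenceEncoder(vocab_size, hidden_dim=hidden_dim)
        self.heads = nn.ModuleDict({
            task: nn.Linear(2 * hidden_dim, out_dim)
            for task, out_dim in task_output_sizes.items()
        })

    def forward(self, token_ids, task):
        return self.heads[task](self.encoder(token_ids))

# Training alternates batches from different tasks; every batch updates the shared encoder.
model = MultiTaskModel(vocab_size=50000,
                       task_output_sizes={"nli": 3, "parse": 128, "translation": 50000})
batch = torch.randint(1, 50000, (8, 20))          # 8 sentences, 20 tokens each
logits = model(batch, task="nli")                 # (8, 3)
```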

Code Repositories

facebookresearch/InferSent (PyTorch)
facebookresearch/SentEval (Official, PyTorch)
Maluuba/gensen (Official, PyTorch)
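For transfer learning, the pretrained encoder released in the Maluuba/gensen repository can be used to produce fixed-length sentence vectors for downstream tasks. The sketch below follows my reading of that repository's README (a GenSenSingle class with a get_representation method); the exact class names, keyword arguments, checkpoint names, and file paths may differ, so treat it as an assumption and confirm against the repository before running.

```python
# Hedged usage sketch for extracting sentence vectors with the Maluuba/gensen repo.
# The interface, checkpoint names, and paths below are assumptions based on the
# repository README and may not match the current code exactly.
from gensen import GenSenSingle

encoder = GenSenSingle(
    model_folder='./data/models',                          # assumed local path to pretrained models
    filename_prefix='nli_large_bothskip',                  # assumed checkpoint name
    pretrained_emb='./data/embedding/glove.840B.300d.h5')  # assumed embedding file

sentences = ["A man is playing a guitar.", "Someone plays music."]
# Per-token hidden states and pooled fixed-length sentence vectors.
reps_h, reps_h_t = encoder.get_representation(
    sentences, pool='last', return_numpy=True, tokenize=True)
print(reps_h_t.shape)   # (2, d); the vector width d depends on the checkpoint
```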

Benchmarks

Benchmark                                           Methodology   Metrics
Natural Language Inference on MultiNLI              GenSen        Matched: 71.4, Mismatched: 71.3
Paraphrase Identification on Quora Question Pairs   GenSen        Accuracy: 87.01
Semantic Textual Similarity on MRPC                 GenSen        Accuracy: 78.6%, F1: 84.4%
Semantic Textual Similarity on SentEval             GenSen        MRPC: 78.6/84.4, SICK-E: 87.8, SICK-R: 0.888, STS: 78.9/78.6
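The metrics in the table are standard: classification accuracy and F1 for the paraphrase identification tasks (the paired 78.6/84.4 for MRPC is accuracy/F1), and correlation between predicted and gold relatedness scores for SICK-R; the paired STS numbers are typically Pearson/Spearman correlations under the SentEval protocol. Below is a minimal sketch of how such numbers are computed from model outputs, using placeholder arrays rather than the paper's predictions.

```python
# Sketch of computing the tabulated metric types from model outputs.
# The label/prediction arrays are placeholders, not results from the paper.
import numpy as np
from sklearn.metrics import accuracy_score, f1_score
from scipy.stats import pearsonr

# Paraphrase identification (MRPC / Quora): binary gold labels vs. predictions.
y_true = np.array([1, 0, 1, 1, 0])
y_pred = np.array([1, 0, 0, 1, 0])
print("Accuracy:", accuracy_score(y_true, y_pred))   # fraction of correctly labeled pairs
print("F1:", f1_score(y_true, y_pred))               # harmonic mean of precision and recall

# Semantic relatedness (SICK-R): Pearson r between predicted and gold similarity scores.
gold = np.array([4.5, 1.2, 3.8, 2.0])
pred = np.array([4.2, 1.5, 3.9, 2.4])
print("Pearson r:", pearsonr(gold, pred)[0])
```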
