Imposing Relation Structure in Language-Model Embeddings Using Contrastive Learning
Christos Theodoropoulos, James Henderson, Andrei C. Coman, Marie-Francine Moens

Abstract
Though language model text embeddings have revolutionized NLP research, their ability to capture high-level semantic information, such as relations between entities in text, is limited. In this paper, we propose a novel contrastive learning framework that trains sentence embeddings to encode the relations in a graph structure. Given a sentence (unstructured text) and its graph, we use contrastive learning to impose relation-related structure on the token-level representations of the sentence obtained with a CharacterBERT (El Boukkouri et al., 2020) model. The resulting relation-aware sentence embeddings achieve state-of-the-art results on the relation extraction task using only a simple KNN classifier, thereby demonstrating the success of the proposed method. An additional t-SNE visualization shows the effectiveness of the learned representation space compared to baselines. Furthermore, we show that we can learn a different space for named entity recognition, again using a contrastive learning objective, and demonstrate how to successfully combine both representation spaces in an entity-relation task.
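The abstract only names the objective at a high level. As a concrete illustration, below is a minimal sketch of a supervised contrastive loss over relation-labeled sentence embeddings, followed by the kind of simple KNN classification the paper evaluates with. This is not the paper's actual loss: the function name, temperature value, and toy data are assumptions for illustration, and the paper's framework operates on CharacterBERT token-level representations rather than random vectors.

```python
import torch
import torch.nn.functional as F
from sklearn.neighbors import KNeighborsClassifier

def supervised_contrastive_loss(embeddings, labels, temperature=0.1):
    """Pull together embeddings that share a relation label and push
    apart those that do not. Hyperparameters are illustrative."""
    z = F.normalize(embeddings, dim=1)           # L2-normalize -> cosine similarity
    sim = (z @ z.T) / temperature                # (B, B) pairwise similarities

    batch = labels.size(0)
    self_mask = torch.eye(batch, dtype=torch.bool, device=z.device)
    sim = sim.masked_fill(self_mask, float('-inf'))  # exclude self-pairs

    # Positives: other in-batch sentences expressing the same relation.
    pos_mask = (labels.unsqueeze(0) == labels.unsqueeze(1)) & ~self_mask

    # Row-wise log-softmax, then average over each anchor's positives.
    log_prob = sim - torch.logsumexp(sim, dim=1, keepdim=True)
    pos_counts = pos_mask.sum(dim=1).clamp(min=1)
    per_anchor = -(log_prob.masked_fill(~pos_mask, 0.0)).sum(dim=1) / pos_counts

    has_pos = pos_mask.any(dim=1)                # anchors without positives are skipped
    return per_anchor[has_pos].mean()

# Toy usage: 8 random "sentence embeddings" over two relation classes.
emb = torch.randn(8, 768)
labels = torch.tensor([0, 0, 0, 0, 1, 1, 1, 1])
print(supervised_contrastive_loss(emb, labels))

# Once the space is trained, relation extraction can be as simple as
# nearest neighbors in that space (a 1-NN classifier, as a toy stand-in).
knn = KNeighborsClassifier(n_neighbors=1).fit(emb.numpy(), labels.numpy())
print(knn.predict(torch.randn(2, 768).numpy()))
```

Normalizing the embeddings and dividing by a temperature is the standard recipe for contrastive objectives of this family; the temperature controls how sharply the softmax concentrates on the hardest negatives.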
Benchmarks
| Benchmark | Methodology | Metrics |
|---|---|---|
| relation-extraction-on-ade-corpus | CLDR + CLNER | NER Macro F1: 88.3; RE+ Macro F1: 79.97 |