ELECTRAMed: a new pre-trained language representation model for biomedical NLP

Giacomo Miolo; Giulio Mantoan; Carlotta Orsenigo

Abstract

The overwhelming amount of biomedical scientific texts calls for the development of effective language models able to tackle a wide range of biomedical natural language processing (NLP) tasks. The most recent dominant approaches are domain-specific models, initialized with general-domain textual data and then trained on a variety of scientific corpora. However, it has been observed that for specialized domains in which large corpora exist, training a model from scratch with just in-domain knowledge may yield better results. Moreover, the increasing focus on the computational cost of pre-training has recently led to the design of more efficient architectures, such as ELECTRA. In this paper, we propose a pre-trained domain-specific language model, called ELECTRAMed, suited for the biomedical field. The novel approach inherits the learning framework of the general-domain ELECTRA architecture, as well as its computational advantages. Experiments performed on benchmark datasets for several biomedical NLP tasks support the usefulness of ELECTRAMed, which sets a new state-of-the-art result on the BC5CDR corpus for named entity recognition and achieves the best outcome in 2 of the 5 runs of the 7th BioASQ-factoid Challenge for the question answering task.
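
ELECTRAMed keeps ELECTRA's replaced-token-detection (RTD) pre-training objective: a small generator fills in masked tokens, and a discriminator learns to classify every token of the resulting corrupted sequence as original or replaced, so the model receives a learning signal from all positions rather than only the masked ones. The sketch below illustrates that objective with toy modules and random data; the module sizes, masking rate, and loss weight are illustrative stand-ins, not the actual ELECTRAMed configuration.

```python
# Minimal sketch of ELECTRA-style replaced-token detection (RTD), the
# pre-training objective ELECTRAMed inherits from general-domain ELECTRA.
# All sizes and modules here are illustrative toys, not the real architecture.
import torch
import torch.nn as nn
import torch.nn.functional as F

VOCAB, HIDDEN, SEQ, BATCH = 1000, 64, 16, 4
MASK_ID, MASK_PROB = 0, 0.15

# Toy stand-ins for the transformer generator and discriminator.
generator = nn.Sequential(nn.Embedding(VOCAB, HIDDEN), nn.Linear(HIDDEN, VOCAB))
discriminator = nn.Sequential(nn.Embedding(VOCAB, HIDDEN), nn.Linear(HIDDEN, 1))

tokens = torch.randint(1, VOCAB, (BATCH, SEQ))      # pretend input ids
mask = torch.rand(BATCH, SEQ) < MASK_PROB           # positions to corrupt
masked = tokens.masked_fill(mask, MASK_ID)

# 1) Generator: masked language modelling on the masked positions.
gen_logits = generator(masked)                      # (B, S, V)
gen_loss = F.cross_entropy(gen_logits[mask], tokens[mask])

# 2) Corrupt the input by sampling replacements from the generator.
with torch.no_grad():
    sampled = torch.distributions.Categorical(logits=gen_logits).sample()
corrupted = torch.where(mask, sampled, tokens)
is_replaced = (corrupted != tokens).float()         # RTD labels

# 3) Discriminator: predict, for every token, whether it was replaced.
disc_logits = discriminator(corrupted).squeeze(-1)  # (B, S)
disc_loss = F.binary_cross_entropy_with_logits(disc_logits, is_replaced)

# Both models are trained jointly; only the discriminator is kept for
# downstream fine-tuning. The weight 50 follows the original ELECTRA paper.
loss = gen_loss + 50.0 * disc_loss
print(float(loss))
```

Because every token contributes to the discriminator loss, this objective extracts more training signal per example than masked language modelling alone, which is the source of the computational advantage the abstract refers to.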

Benchmarks

Benchmark | Methodology | Metric
Drug-Drug Interaction Extraction on DDI | ELECTRAMed | Micro F1: 79.13
Named Entity Recognition (NER) on BC5CDR | ELECTRAMed | F1: 90.03
Named Entity Recognition (NER) on NCBI-Disease | ELECTRAMed | F1: 87.54
Relation Extraction on ChemProt | ELECTRAMed | F1: 72.94
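
The NER and relation-extraction results above come from fine-tuning the pre-trained discriminator on each task. As a hedged illustration of that setup, the sketch below loads an ELECTRA-style checkpoint for token classification with Hugging Face Transformers; it uses the public google/electra-small-discriminator identifier as a stand-in because this page does not give the ELECTRAMed checkpoint name, and the three-tag IOB label set is a simplified example.

```python
# Illustrative fine-tuning setup for ELECTRA-style biomedical NER
# (e.g. BC5CDR or NCBI-Disease). The checkpoint is a general-domain stand-in;
# substitute the actual ELECTRAMed weights, which are not named on this page.
import torch
from transformers import AutoTokenizer, AutoModelForTokenClassification

CHECKPOINT = "google/electra-small-discriminator"  # stand-in for ELECTRAMed
labels = ["O", "B-Disease", "I-Disease"]           # simplified IOB tag set

tokenizer = AutoTokenizer.from_pretrained(CHECKPOINT)
model = AutoModelForTokenClassification.from_pretrained(
    CHECKPOINT, num_labels=len(labels)
)

# One forward pass on a biomedical sentence; a real run would align word-level
# IOB tags to sub-word tokens and train with Trainer or a plain PyTorch loop.
enc = tokenizer("Cisplatin induced nephrotoxicity in rats.", return_tensors="pt")
with torch.no_grad():
    logits = model(**enc).logits                   # (1, seq_len, num_labels)
predictions = logits.argmax(dim=-1).squeeze(0)
print([labels[i] for i in predictions.tolist()])
```

The relation-extraction and question-answering tasks follow the same pattern with a sequence-classification or span-prediction head in place of the token-classification head.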
