Chemformer: a pre-trained transformer for computational chemistry

Esben Jannik Bjerrum, Jiazhen He, Spyridon Dimitriadis, Ross Irwin

Abstract

Transformer models coupled with the simplified molecular-input line-entry system (SMILES) have recently proven to be a powerful combination for solving challenges in cheminformatics. These models, however, are often developed for a single application and can be very resource-intensive to train. In this work we present Chemformer, a Transformer-based model which can be quickly applied to both sequence-to-sequence and discriminative cheminformatics tasks. Additionally, we show that self-supervised pre-training can improve performance and significantly speed up convergence on downstream tasks. On direct synthesis and retrosynthesis prediction benchmark datasets we report state-of-the-art top-1 accuracy. We also improve on existing approaches for a molecular optimisation task and show that Chemformer can optimise on multiple discriminative tasks simultaneously. Models, datasets and code will be made available after publication.
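Chemformer follows the BART encoder-decoder recipe, reading one SMILES string and emitting another (for example, product to reactants in retrosynthesis). As a rough illustration of that setup, and not the authors' implementation, the sketch below wires a tiny, randomly initialised BART from Hugging Face transformers to a character-level SMILES vocabulary and runs one supervised forward pass; the toy reaction pairs, the tokenisation, and every hyperparameter are assumptions for demonstration only.

```python
# Toy sketch, not the authors' code: Chemformer is BART-based, so we mimic
# its sequence-to-sequence setup (source SMILES -> target SMILES) with a
# small randomly initialised BART. Tokeniser, data and hyperparameters are
# placeholders.
import torch
from transformers import BartConfig, BartForConditionalGeneration

# Character-level vocabulary over a few toy SMILES pairs (assumption: the
# real model uses a proper SMILES tokeniser and a pre-trained checkpoint).
pairs = [("CCO", "CC=O"), ("c1ccccc1", "c1ccccc1O")]  # (source, target)
chars = sorted({c for s, t in pairs for c in s + t})
PAD, BOS, EOS = 0, 1, 2
stoi = {c: i + 3 for i, c in enumerate(chars)}

def encode(smiles, max_len=16):
    ids = [BOS] + [stoi[c] for c in smiles] + [EOS]
    return ids + [PAD] * (max_len - len(ids))

config = BartConfig(
    vocab_size=len(stoi) + 3,
    d_model=64, encoder_layers=2, decoder_layers=2,
    encoder_attention_heads=4, decoder_attention_heads=4,
    encoder_ffn_dim=128, decoder_ffn_dim=128,
    pad_token_id=PAD, bos_token_id=BOS, eos_token_id=EOS,
)
model = BartForConditionalGeneration(config)

src = torch.tensor([encode(s) for s, _ in pairs])
tgt = torch.tensor([encode(t) for _, t in pairs])
labels = tgt.masked_fill(tgt == PAD, -100)  # padding is ignored by the loss

out = model(input_ids=src, attention_mask=(src != PAD).long(), labels=labels)
print(out.loss)  # cross-entropy over the target SMILES tokens
```

In the paper, a self-supervised denoising objective over masked SMILES is applied before any such fine-tuning; the sketch shows only the downstream sequence-to-sequence loss.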

Benchmarks

Benchmark: single-step-retrosynthesis-on-uspto-50k
Methodology: Chemformer-Large (reaction class unknown)
Metrics:
  Top-1 accuracy: 54.3
  Top-5 accuracy: 62.3
  Top-10 accuracy: 63.0
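For reference, top-k accuracy on retrosynthesis benchmarks is conventionally exact-match after canonicalisation: a test case counts as solved if the ground-truth reactants appear among the model's k highest-ranked beam-search candidates. The helper below is a minimal sketch of that convention, assuming RDKit for canonicalisation; the paper does not include this exact scoring script, so all names here are illustrative.

```python
# Minimal sketch of the usual top-k exact-match scoring for retrosynthesis.
# RDKit canonicalisation and all names are assumptions, not the paper's
# evaluation script.
from rdkit import Chem

def canonical(smiles):
    """Canonical SMILES, or None if RDKit cannot parse the string."""
    mol = Chem.MolFromSmiles(smiles)
    return Chem.MolToSmiles(mol) if mol is not None else None

def top_k_accuracy(candidates, targets, k):
    """candidates: one ranked list of predicted SMILES per test case;
    targets: the ground-truth SMILES for each case."""
    hits = 0
    for cands, target in zip(candidates, targets):
        if canonical(target) in {canonical(c) for c in cands[:k]}:
            hits += 1
    return hits / len(targets)

# Toy usage: two test cases with two beam candidates each.
candidates = [["CC=O", "CCO"], ["c1ccccc1O", "c1ccccc1"]]
targets = ["CCO", "c1ccccc1"]
print(top_k_accuracy(candidates, targets, k=2))  # -> 1.0
```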
