
CodeBERT: A Pre-Trained Model for Programming and Natural Languages

Abstract

We present CodeBERT, a bimodal pre-trained model for programming language (PL) and natural language (NL). CodeBERT learns general-purpose representations that support downstream NL-PL applications such as natural language code search, code documentation generation, etc. We develop CodeBERT with a Transformer-based neural architecture, and train it with a hybrid objective function that incorporates the pre-training task of replaced token detection, which is to detect plausible alternatives sampled from generators. This enables us to utilize both bimodal data of NL-PL pairs and unimodal data, where the former provides input tokens for model training while the latter helps to learn better generators. We evaluate CodeBERT on two NL-PL applications by fine-tuning model parameters. Results show that CodeBERT achieves state-of-the-art performance on both natural language code search and code documentation generation tasks. Furthermore, to investigate what type of knowledge is learned in CodeBERT, we construct a dataset for NL-PL probing, and evaluate in a zero-shot setting where parameters of pre-trained models are fixed. Results show that CodeBERT performs better than previous pre-trained models on NL-PL probing.
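As a quick illustration of how the model consumes NL-PL pairs, the sketch below loads the publicly released checkpoint through the Hugging Face Transformers library and encodes a natural-language query together with a code snippet into a joint representation. The checkpoint name `microsoft/codebert-base` refers to the released model; the first-token pooling at the end is an illustrative assumption (one common convention for code-search fine-tuning), not a step prescribed by the paper.

```python
# Minimal sketch: encode an NL-PL pair with a released CodeBERT checkpoint.
# Assumes the Hugging Face "microsoft/codebert-base" checkpoint; the pooling
# choice below is illustrative, not part of the original training recipe.
import torch
from transformers import AutoTokenizer, AutoModel

tokenizer = AutoTokenizer.from_pretrained("microsoft/codebert-base")
model = AutoModel.from_pretrained("microsoft/codebert-base")
model.eval()

nl = "return the maximum value in a list"          # natural-language query
code = "def max_value(xs):\n    return max(xs)"    # candidate code snippet

# CodeBERT is trained on concatenated pairs of the form [CLS] NL [SEP] code.
# Passing the two segments as a pair lets the tokenizer insert the special
# tokens automatically.
inputs = tokenizer(nl, code, return_tensors="pt", truncation=True, max_length=512)

with torch.no_grad():
    outputs = model(**inputs)

# Take the first-token hidden state as a joint NL-PL representation.
pair_embedding = outputs.last_hidden_state[:, 0, :]
print(pair_embedding.shape)  # torch.Size([1, 768])
```

In a code-search setting, such pair representations (or separately encoded query and code vectors) would typically be scored and ranked after fine-tuning the model parameters, as the abstract describes.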

