Fix Bugs with Transformer through a Neural-Symbolic Edit Grammar

Yaojie Hu, Xingjian Shi, Qiang Zhou, Lee Pike

Abstract

We introduce NSEdit (neural-symbolic edit), a novel Transformer-based code repair method. Given only the source code that contains bugs, NSEdit predicts an editing sequence that can fix the bugs. The edit grammar is formulated as a regular language, and the Transformer uses it as a neural-symbolic scripting interface to generate editing programs. We modify the Transformer and add a pointer network to select the edit locations. An ensemble of rerankers is trained to re-rank the editing sequences generated by beam search. We fine-tune the rerankers on the validation set to reduce over-fitting. NSEdit is evaluated on various code repair datasets and achieves a new state-of-the-art accuracy (24.04%) on the Tufano small dataset of the CodeXGLUE benchmark. NSEdit performs robustly when programs vary across packages and when buggy programs are concrete. We conduct a detailed analysis of our methods and demonstrate the effectiveness of each component.
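
To make the notion of an editing sequence concrete, the sketch below shows one plausible token-level encoding of such edits and how a predicted sequence could be applied to buggy source tokens. The operation set (delete/insert/replace), the token-level granularity, and the apply_edits helper are illustrative assumptions rather than the paper's published interface; in NSEdit the edit locations come from the pointer network and the sequence itself is generated by the Transformer decoder under the edit grammar.

```python
# Minimal illustrative sketch, not NSEdit's actual implementation.
# Assumptions: token-level edits, an op set of {delete, insert, replace},
# and integer locations standing in for pointer-network selections.
from dataclasses import dataclass
from typing import List


@dataclass
class Edit:
    op: str              # "delete", "insert", or "replace" (assumed op set)
    location: int        # token index, here chosen by hand
    tokens: List[str]    # payload tokens for insert/replace; empty for delete


def apply_edits(source_tokens: List[str], edits: List[Edit]) -> List[str]:
    """Apply an editing sequence to a buggy token list.

    Edits are applied right to left so that earlier locations remain valid
    after later positions have been modified.
    """
    patched = list(source_tokens)
    for edit in sorted(edits, key=lambda e: e.location, reverse=True):
        if edit.op == "delete":
            del patched[edit.location]
        elif edit.op == "insert":
            patched[edit.location:edit.location] = edit.tokens
        elif edit.op == "replace":
            patched[edit.location:edit.location + 1] = edit.tokens
        else:
            raise ValueError(f"unknown edit op: {edit.op}")
    return patched


if __name__ == "__main__":
    buggy = ["if", "(", "x", "=", "0", ")", "return", ";"]
    # In NSEdit this sequence would come from beam search plus reranking;
    # here a single hand-written edit fixes the assignment-vs-comparison bug.
    fix = [Edit(op="replace", location=3, tokens=["=="])]
    print(" ".join(apply_edits(buggy, fix)))  # if ( x == 0 ) return ;
```

Applying edits from the highest location downward is a standard way to keep earlier indices stable; whether NSEdit orders its edits this way is not stated in the abstract.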

Benchmarks

Benchmark: code-repair-on-codexglue-bugs2fix
Methodology: NSEdit
Metrics:
  Accuracy (medium): 13.87
  Accuracy (small): 24.04
