
CoMER: Modeling Coverage for Transformer-based Handwritten Mathematical Expression Recognition

Wenqi Zhao Liangcai Gao

Abstract

The Transformer-based encoder-decoder architecture has recently made significant advances in recognizing handwritten mathematical expressions. However, the transformer model still suffers from the lack of coverage problem, making its expression recognition rate (ExpRate) inferior to its RNN counterpart. Coverage information, which records the alignment information of the past steps, has proven effective in the RNN models. In this paper, we propose CoMER, a model that adopts the coverage information in the transformer decoder. Specifically, we propose a novel Attention Refinement Module (ARM) to refine the attention weights with past alignment information without hurting its parallelism. Furthermore, we take coverage information to the extreme by proposing self-coverage and cross-coverage, which utilize the past alignment information from the current and previous layers. Experiments show that CoMER improves the ExpRate by 0.61%/2.09%/1.59% compared to the current state-of-the-art model, and reaches 59.33%/59.81%/62.97% on the CROHME 2014/2016/2019 test sets.
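To make the coverage idea concrete, the sketch below shows one way such an attention-refinement step could look in PyTorch. It is a minimal sketch, not the paper's implementation: the module name `CoverageRefinement`, the two-layer MLP `phi`, and the `hidden` size are illustrative assumptions (CoMER computes its refinement term with a convolution over the 2-D feature map). The sketch only illustrates the core mechanism: accumulate past alignment into a coverage term and subtract a learned refinement from the raw attention logits before the softmax.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class CoverageRefinement(nn.Module):
    """Minimal sketch of coverage-based attention refinement.

    `phi` is a hypothetical stand-in for the paper's refinement
    network; CoMER uses a convolution over the image feature map.
    """

    def __init__(self, hidden: int = 32):
        super().__init__()
        # Maps the scalar coverage at each position to a scalar
        # refinement term (illustrative parameterization).
        self.phi = nn.Sequential(
            nn.Linear(1, hidden), nn.ReLU(), nn.Linear(hidden, 1)
        )

    def forward(self, scores: torch.Tensor, align: torch.Tensor) -> torch.Tensor:
        # scores, align: (batch, T, L) with T decoding steps and
        # L image feature positions. `align` is the attention whose
        # running sum forms the coverage: the same layer's weights
        # for self-coverage, the previous layer's for cross-coverage.
        coverage = torch.cumsum(align, dim=1) - align  # steps before t only
        refine = self.phi(coverage.unsqueeze(-1)).squeeze(-1)
        # Penalize positions the decoder has already attended to.
        return F.softmax(scores - refine, dim=-1)

# Usage: refine raw cross-attention logits with self-coverage.
arm = CoverageRefinement()
scores = torch.randn(2, 10, 64)        # (batch, steps, positions)
align = F.softmax(scores, dim=-1)      # stand-in for attention weights
attn = arm(scores, align)              # refined attention, same shape
```

Because the coverage is a cumulative sum over attention rows that are all available at once during training, the refinement can be evaluated for every decoding step in a single pass, which is what allows the ARM to keep the transformer decoder's parallelism.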

