Handwritten Mathematical Expression Recognition with Bidirectionally Trained Transformer
Wenqi Zhao, Liangcai Gao, Zuoyu Yan, Shuai Peng, Lin Du, Ziyin Zhang

Abstract
Encoder-decoder models have made great progress on handwritten mathematical expression recognition recently. However, it is still a challenge for existing methods to assign attention to image features accurately. Moreover, those encoder-decoder models usually adopt RNN-based models in their decoder part, which makes them inefficient in processing long $\LaTeX{}$ sequences. In this paper, a transformer-based decoder is employed to replace RNN-based ones, which makes the whole model architecture very concise. Furthermore, a novel training strategy is introduced to fully exploit the potential of the transformer in bidirectional language modeling. Compared to several methods that do not use data augmentation, experiments demonstrate that our model improves the ExpRate of current state-of-the-art methods on CROHME 2014 by 2.23%. Similarly, on CROHME 2016 and CROHME 2019, we improve the ExpRate by 1.92% and 2.28% respectively.
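The bidirectional training idea can be illustrated with a minimal sketch: each target LaTeX sequence is paired with its reverse, so a single transformer decoder is trained on both left-to-right and right-to-left language modeling. The token names (`<sos>`, `<eos>`) and the exact pairing scheme here are assumptions for illustration, not the paper's precise implementation.

```python
# Hypothetical sketch of bidirectional target construction for a
# transformer decoder. Each LaTeX token sequence yields two training
# targets: one in normal reading order, one reversed, so one decoder
# learns both directions. Token names are assumptions.

SOS, EOS = "<sos>", "<eos>"

def bidirectional_targets(tokens):
    """Return (l2r, r2l) decoder targets for one LaTeX token sequence."""
    l2r = [SOS] + tokens + [EOS]        # left-to-right target
    r2l = [EOS] + tokens[::-1] + [SOS]  # right-to-left target
    return l2r, r2l

l2r, r2l = bidirectional_targets(["\\frac", "{", "x", "}", "{", "2", "}"])
print(l2r)  # ['<sos>', '\\frac', '{', 'x', '}', '{', '2', '}', '<eos>']
print(r2l)  # ['<eos>', '}', '2', '{', '}', 'x', '{', '\\frac', '<sos>']
```

At training time both targets would be fed to the same decoder (e.g. in one batch), which doubles the supervision signal without adding parameters.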
Benchmarks
| Benchmark | Methodology | Metrics |
|---|---|---|
| handwritten-mathmatical-expression | BTTR | ExpRate: 53.96 |
| handwritten-mathmatical-expression-1 | BTTR | ExpRate: 52.31 |
| handwritten-mathmatical-expression-2 | BTTR | ExpRate: 52.96 |
| handwritten-mathmatical-expression-3 | BTTR | ExpRate: 64.1 |