Graph2Seq: Graph to Sequence Learning with Attention-based Neural Networks
Kun Xu, Lingfei Wu, Zhiguo Wang, Yansong Feng, Michael Witbrock, Vadim Sheinin

Abstract
The celebrated Sequence to Sequence learning (Seq2Seq) technique and its numerous variants achieve excellent performance on many tasks. However, many machine learning tasks have inputs naturally represented as graphs; existing Seq2Seq models face a significant challenge in achieving accurate conversion from graph form to the appropriate sequence. To address this challenge, we introduce a novel general end-to-end graph-to-sequence neural encoder-decoder model that maps an input graph to a sequence of vectors and uses an attention-based LSTM method to decode the target sequence from these vectors. Our method first generates the node and graph embeddings using an improved graph-based neural network with a novel aggregation strategy to incorporate edge direction information in the node embeddings. We further introduce an attention mechanism that aligns node embeddings and the decoding sequence to better cope with large graphs. Experimental results on bAbI, Shortest Path, and Natural Language Generation tasks demonstrate that our model achieves state-of-the-art performance and significantly outperforms existing graph neural networks, Seq2Seq, and Tree2Seq models; using the proposed bi-directional node embedding aggregation strategy, the model can converge rapidly to the optimal performance.
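The bi-directional aggregation idea mentioned in the abstract can be illustrated with a minimal sketch: each node keeps two directional views, one built from its outgoing (forward) neighbours and one from its incoming (backward) neighbours, and the final node embedding concatenates both. This is an illustrative toy with mean-pooling in place of the paper's learned aggregators; the function and variable names are assumptions, not from the authors' released code.

```python
# Toy sketch of bi-directional node-embedding aggregation (illustrative;
# mean-pooling stands in for the paper's learned aggregator networks).

def aggregate_bidirectional(embeddings, edges, hops=2):
    """embeddings: {node: [float]}; edges: list of directed (src, dst) pairs."""
    fwd = {n: list(e) for n, e in embeddings.items()}  # forward (outgoing) view
    bwd = {n: list(e) for n, e in embeddings.items()}  # backward (incoming) view
    out_nbrs = {n: [d for s, d in edges if s == n] for n in embeddings}
    in_nbrs = {n: [s for s, d in edges if d == n] for n in embeddings}

    def mean_pool(own, nbr_vecs):
        # Mean-pool neighbour vectors, then average with the node's own
        # vector (a stand-in for the learned combine step).
        if not nbr_vecs:
            return own
        pooled = [sum(vals) / len(nbr_vecs) for vals in zip(*nbr_vecs)]
        return [(a + b) / 2 for a, b in zip(own, pooled)]

    for _ in range(hops):
        fwd = {n: mean_pool(fwd[n], [fwd[m] for m in out_nbrs[n]]) for n in fwd}
        bwd = {n: mean_pool(bwd[n], [bwd[m] for m in in_nbrs[n]]) for n in bwd}

    # Final per-node embedding: concatenation of both directional views.
    return {n: fwd[n] + bwd[n] for n in embeddings}


# Tiny directed chain a -> b -> c with 1-dimensional embeddings.
result = aggregate_bidirectional(
    {"a": [1.0], "b": [2.0], "c": [3.0]},
    [("a", "b"), ("b", "c")],
    hops=1,
)
```

In the full model these directional node embeddings are what the attention-based LSTM decoder attends over; a separate graph-level embedding initializes the decoder state.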
Benchmarks
| Benchmark | Methodology | Metrics |
|---|---|---|
| sql-to-text-on-wikisql | Graph2Seq-PGE | BLEU-4: 38.97 |