Graph Propagation Transformer for Graph Representation Learning

Zhe Chen; Hao Tan; Tao Wang; Tianrun Shen; Tong Lu; Qiuying Peng; Cheng Cheng; Yue Qi

Abstract

This paper presents a novel transformer architecture for graph representation learning. The core insight of our method is to fully account for the information propagation among nodes and edges in a graph when building the attention module in the transformer blocks. Specifically, we propose a new attention mechanism called Graph Propagation Attention (GPA). It explicitly passes information among nodes and edges in three ways, i.e., node-to-node, node-to-edge, and edge-to-node, which is essential for learning graph-structured data. On this basis, we design an effective transformer architecture named Graph Propagation Transformer (GPTrans) to further facilitate learning on graph data. We verify the performance of GPTrans in a wide range of graph learning experiments on several benchmark datasets. The results show that our method outperforms many state-of-the-art transformer-based graph models. The code will be released at https://github.com/czczup/GPTrans.
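The three propagation paths described above can be illustrated with a minimal sketch. This is not the authors' implementation (see their repository for that); it is a hypothetical NumPy rendering in which node-to-node is edge-biased scaled dot-product attention, node-to-edge updates edge features from the attention scores, and edge-to-node aggregates the updated edge features back into the nodes. The weight names (`Wq`, `Wk`, `Wv`, `we`, `Wout`) and the scalar edge representation are simplifying assumptions.

```python
import numpy as np

def softmax(x, axis=-1):
    """Numerically stable softmax."""
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def gpa_attention(x, e, Wq, Wk, Wv, we, Wout):
    """One hypothetical GPA step (single head, dense graph).

    x: (N, d) node features; e: (N, N) scalar edge features.
    we: scalar gate for the node-to-edge update (assumption).
    Returns updated node features (N, d) and edge features (N, N).
    """
    d = x.shape[1]
    q, k, v = x @ Wq, x @ Wk, x @ Wv

    # node-to-node: attention scores biased by edge features
    scores = q @ k.T / np.sqrt(d) + e
    nodes = softmax(scores, axis=-1) @ v

    # node-to-edge: edges updated from the pre-softmax attention map
    e_new = e + we * scores

    # edge-to-node: updated edges routed back into node features
    edge_msg = softmax(e_new, axis=-1) @ v
    nodes = (nodes + edge_msg) @ Wout
    return nodes, e_new
```

In a full model this block would sit inside a standard transformer layer (multi-head, with residual connections and normalization), with per-edge feature vectors rather than scalars.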

Code Repositories

czczup/gptrans (official, PyTorch)

Benchmarks

| Benchmark | Method | Metrics |
| --- | --- | --- |
| graph-property-prediction-on-ogbg-molhiv | GPTrans-B | Ext. data: Yes; Test ROC-AUC: 0.8126 ± 0.0032 |
| graph-regression-on-pcqm4m-lsc | GPTrans-L | Validation MAE: 0.1151 |
| graph-regression-on-pcqm4mv2-lsc | GPTrans-L | Test MAE: 0.0821; Validation MAE: 0.0809 |
| graph-regression-on-pcqm4mv2-lsc | GPTrans-T | Test MAE: 0.0842; Validation MAE: 0.0833 |
| graph-regression-on-zinc-500k | GPTrans-Nano | MAE: 0.077 |
| node-classification-on-cluster | GPTrans-Nano | Accuracy: 78.07 |
| node-classification-on-pattern | GPTrans-Nano | Accuracy: 86.734 ± 0.008 |
