
SignGT: Signed Attention-based Graph Transformer for Graph Representation Learning

Jinsong Chen, Gaichao Li, John E. Hopcroft, Kun He


Abstract

The emerging graph Transformers have achieved impressive performance for graph representation learning over graph neural networks (GNNs). In this work, we regard the self-attention mechanism, the core module of graph Transformers, as a two-step aggregation operation on a fully connected graph. Because it produces only positive attention values, the self-attention mechanism is equivalent to conducting a smoothing operation over all nodes, preserving the low-frequency information. However, capturing only the low-frequency information is insufficient for learning the complex relations of nodes on diverse graphs, such as heterophily graphs where the high-frequency information is crucial. To this end, we propose a Signed Attention-based Graph Transformer (SignGT) to adaptively capture various frequency information from the graphs. Specifically, SignGT develops a new signed self-attention mechanism (SignSA) that produces signed attention values according to the semantic relevance of node pairs. Hence, the diverse frequency information between different node pairs can be carefully preserved. In addition, SignGT proposes a structure-aware feed-forward network (SFFN) that introduces a neighborhood bias to preserve the local topology information. In this way, SignGT can learn informative node representations from both long-range dependencies and local topology information. Extensive empirical results on both node-level and graph-level tasks indicate the superiority of SignGT against state-of-the-art graph Transformers as well as advanced GNNs.
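The core observation above can be sketched in a few lines: standard softmax attention yields strictly positive row-normalized weights, so every node is a convex (smoothing, low-pass) combination of all others. A signed variant keeps the sign of each query-key score and normalizes only the magnitudes, letting a node subtract as well as add other nodes' features. Note the exact SignSA formulation is not given in the abstract; the normalization below (softmax over absolute scores, with the sign reattached) is an illustrative assumption, not the paper's method.

```python
import numpy as np

def signed_attention(X, Wq, Wk, Wv):
    """Illustrative signed self-attention (assumed form, not the exact SignSA).

    Unlike softmax attention, whose weights are all positive (a pure
    smoothing/low-pass aggregation), this keeps the sign of each
    query-key score, so negative weights can express high-frequency
    (difference-based) relations between node pairs.
    """
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    scores = (Q @ K.T) / np.sqrt(K.shape[1])
    sign = np.sign(scores)
    # Normalize magnitudes only (numerically stable softmax over |scores|).
    mag = np.exp(np.abs(scores) - np.abs(scores).max(axis=1, keepdims=True))
    weights = sign * (mag / mag.sum(axis=1, keepdims=True))
    return weights @ V, weights

# Toy example: 5 nodes with 8-dimensional features.
rng = np.random.default_rng(0)
X = rng.standard_normal((5, 8))
Wq, Wk, Wv = (rng.standard_normal((8, 8)) for _ in range(3))
out, A = signed_attention(X, Wq, Wk, Wv)
```

With random features, `A` contains both positive and negative entries, while the absolute weights of each row still sum to one, so the aggregation magnitude stays normalized.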

Benchmarks

Benchmark                          Methodology   Metrics
node-classification-on-actor       SignGT        Accuracy: 38.65±0.32
node-classification-on-chameleon   SignGT        Accuracy: 74.31±1.24
node-classification-on-squirrel    SignGT        -
