Improving Attention Mechanism in Graph Neural Networks via Cardinality Preservation

Shuo Zhang; Lei Xie
Abstract

Graph Neural Networks (GNNs) are powerful tools for learning representations of graph-structured data. Most GNNs use a message-passing scheme, in which a node's embedding is iteratively updated by aggregating information from its neighbors. To better model the influence of individual nodes, the attention mechanism has become a popular way to assign trainable weights to nodes during aggregation. Although attention-based GNNs have achieved remarkable results on various tasks, a clear understanding of their discriminative capacity is missing. In this work, we present a theoretical analysis of the representational properties of GNNs that adopt the attention mechanism as an aggregator. Our analysis determines all cases in which such attention-based GNNs always fail to distinguish certain distinct structures. These cases arise because attention-based aggregation ignores cardinality information. To improve the performance of attention-based GNNs, we propose Cardinality Preserved Attention (CPA) models that can be applied to any kind of attention mechanism. Our experiments on node and graph classification confirm our theoretical analysis and show the competitive performance of our CPA models.
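The failure mode described above can be illustrated with a minimal numpy sketch (this is an illustrative assumption, not the authors' implementation): because attention weights are normalized to sum to 1, the aggregation of one neighbor with feature h is identical to the aggregation of k copies of that neighbor, so the cardinality of the neighborhood multiset is lost. A "Scaled"-style CPA variant, sketched here as simply rescaling the weighted sum by the neighbor count, restores that distinction.

```python
import numpy as np

def softmax(x):
    # numerically stable softmax over attention scores
    e = np.exp(x - x.max())
    return e / e.sum()

def attention_aggregate(neighbors, scores):
    # Standard attention aggregation: weights sum to 1,
    # so neighborhood cardinality cannot be recovered.
    alpha = softmax(scores)          # shape (n,)
    return alpha @ neighbors         # shape (d,)

def cpa_scaled_aggregate(neighbors, scores):
    # Hypothetical "Scaled" CPA sketch: rescale the attention-weighted
    # sum by |N(v)| so multisets of different sizes stay distinguishable.
    return len(neighbors) * attention_aggregate(neighbors, scores)

# One neighbor vs. two identical neighbors: plain attention cannot tell
# them apart, while the cardinality-scaled variant can.
one = np.array([[1.0, 2.0]])
two = np.array([[1.0, 2.0], [1.0, 2.0]])
print(attention_aggregate(one, np.zeros(1)))   # same as for `two`
print(attention_aggregate(two, np.zeros(2)))
print(cpa_scaled_aggregate(one, np.zeros(1)))  # differs from `two`
print(cpa_scaled_aggregate(two, np.zeros(2)))
```

The function names and the identity-times-cardinality scaling here are placeholders; the paper's actual models attach the cardinality term inside the GNN layer and support other attention variants.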

Code Repositories

zetayue/CPA (official, PyTorch)

Benchmarks

Benchmark | Methodology | Accuracy
graph-classification-on-enzymes | GAT-GC (f-Scaled) | 58.45
graph-classification-on-mutag | GAT-GC (f-Scaled) | 90.44%
graph-classification-on-nci1 | GAT-GC (f-Scaled) | 82.28%
graph-classification-on-proteins | GAT-GC (f-Scaled) | 76.81%
graph-classification-on-re-m5k | GAT-GC (f-Scaled) | 57.22%
graph-classification-on-reddit-b | GAT-GC (f-Scaled) | 92.57

