Representation Learning on Graphs with Jumping Knowledge Networks

Keyulu Xu; Chengtao Li; Yonglong Tian; Tomohiro Sonobe; Ken-ichi Kawarabayashi; Stefanie Jegelka

Abstract

Recent deep learning approaches for representation learning on graphs follow a neighborhood aggregation procedure. We analyze some important properties of these models, and propose a strategy to overcome those. In particular, the range of "neighboring" nodes that a node's representation draws from strongly depends on the graph structure, analogous to the spread of a random walk. To adapt to local neighborhood properties and tasks, we explore an architecture -- jumping knowledge (JK) networks -- that flexibly leverages, for each node, different neighborhood ranges to enable better structure-aware representation. In a number of experiments on social, bioinformatics and citation networks, we demonstrate that our model achieves state-of-the-art performance. Furthermore, combining the JK framework with models like Graph Convolutional Networks, GraphSAGE and Graph Attention Networks consistently improves those models' performance.
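To make the layer-aggregation idea concrete, here is a minimal NumPy sketch of one JK variant — element-wise max-pooling over the representations produced after each aggregation step, so each node effectively picks its own neighborhood range. The paper also proposes concatenation and LSTM-attention aggregators; the mean-aggregation step and all names below are illustrative, not the authors' reference implementation.

```python
import numpy as np

def aggregate(adj, h):
    """One GCN-style mean aggregation over neighbors (self-loops included)."""
    adj_sl = adj + np.eye(adj.shape[0])       # add self-loops
    deg = adj_sl.sum(axis=1, keepdims=True)   # per-node degree
    return (adj_sl @ h) / deg                 # mean of neighbor features

def jk_max(adj, x, num_layers=3):
    """Jumping-knowledge aggregation (max-pooling variant):
    keep every intermediate representation, then take an element-wise
    max across layers so each node draws on its best-fitting range."""
    layers = [x]
    h = x
    for _ in range(num_layers):
        h = aggregate(adj, h)
        layers.append(h)
    return np.max(np.stack(layers), axis=0)

# Toy path graph 0-1-2-3 with one-hot node features.
adj = np.array([[0, 1, 0, 0],
                [1, 0, 1, 0],
                [0, 1, 0, 1],
                [0, 0, 1, 0]], dtype=float)
x = np.eye(4)
out = jk_max(adj, x)
```

Because the layer-0 (raw feature) representation is included in the max, a node's own one-hot entry survives regardless of depth, while entries for distant nodes grow with the number of aggregation rounds — the per-node "range" adapts instead of being fixed by network depth.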

Benchmarks

Benchmark | Methodology | Metrics
node-classification-on-ppi | JK-LSTM | F1: 97.6
