Graph Classification on IMDB-B
Evaluation metric: Accuracy
Evaluation results: performance of each model on this benchmark.
| Model Name | Accuracy | Paper Title | Repository |
|---|---|---|---|
| U2GNN (Unsupervised) | 96.41% | Universal Graph Transformer Self-Attention Networks | |
| ESA (Edge set attention, no positional encodings) | 86.250±0.957 | An end-to-end attention-based approach for learning on graphs | |
| GAT | 84.250±2.062 | Graph Attention Networks | |
| MEWISPool | 82.13% | Maximum Entropy Weighted Independent Set Pooling for Graph Neural Networks | |
| GIN | 81.250±3.775 | How Powerful are Graph Neural Networks? | |
| TokenGT | 80.250±3.304 | Pure Transformers are Powerful Graph Learners | |
| GATv2 | 80.000±2.739 | How Attentive are Graph Attention Networks? | |
| G_ResNet | 79.90% | When Work Matters: Transforming Classical Network Structures to Graph CNN | - |
| GCN | 79.500±3.109 | Semi-Supervised Classification with Graph Convolutional Networks | |
| GraphGPS | 79.250±3.096 | Recipe for a General, Powerful, Scalable Graph Transformer | |
| DUGNN | 78.70% | Learning Universal Graph Neural Network Embeddings With Aid Of Transfer Learning | |
| TFGW ADJ (L=2) | 78.3% | Template based Graph Neural Network with Optimal Transport Distances | |
| PNA | 78.000±3.808 | Principal Neighbourhood Aggregation for Graph Nets | |
| sGIN | 77.94% | Mutual Information Maximization in Graph Neural Networks | |
| Graphormer | 77.500±2.646 | Do Transformers Really Perform Bad for Graph Representation? | |
| SEG-BERT | 77.2% | Segmented Graph-Bert for Graph Instance Modeling | |
| U2GNN | 77.04% | Universal Graph Transformer Self-Attention Networks | |
| PIN | 76.6% | Weisfeiler and Lehman Go Paths: Learning Topological Features via Path Complexes | - |
| GIUNet | 76% | Graph isomorphism UNet | - |
| DropGIN | 75.7% | DropGNN: Random Dropouts Increase the Expressiveness of Graph Neural Networks | |
Only the first 20 of the 50 leaderboard entries are reproduced above, sorted by accuracy.
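For context on what this benchmark measures, the sketch below is a minimal, hypothetical example (not the evaluation code behind this leaderboard): it loads IMDB-BINARY via PyTorch Geometric's TUDataset, attaches one-hot node-degree features (the graphs carry no native node attributes, and degree features are the common convention for social-network datasets, following the GIN paper), trains a small GIN, and computes classification accuracy on a single 90/10 split. The numbers in the table come from the cited papers, which typically report 10-fold cross-validation over the 1,000 IMDB-BINARY graphs, so results from this sketch are not directly comparable.

```python
# Minimal sketch: graph classification accuracy on IMDB-BINARY with a small GIN.
# Assumes PyTorch and PyTorch Geometric are installed; hyperparameters are illustrative.
import torch
import torch.nn.functional as F
from torch.nn import Linear, ReLU, Sequential
import torch_geometric.transforms as T
from torch_geometric.datasets import TUDataset
from torch_geometric.loader import DataLoader
from torch_geometric.nn import GINConv, global_add_pool
from torch_geometric.utils import degree

# IMDB-BINARY has no node features, so use one-hot node degrees as input features.
dataset = TUDataset(root="data/TUDataset", name="IMDB-BINARY")
max_deg = max(int(degree(d.edge_index[1], num_nodes=d.num_nodes).max()) for d in dataset)
dataset.transform = T.OneHotDegree(max_deg)

dataset = dataset.shuffle()
n_test = len(dataset) // 10                      # single 90/10 split for illustration
test_ds, train_ds = dataset[:n_test], dataset[n_test:]

class GIN(torch.nn.Module):
    def __init__(self, in_dim, hidden=64, num_classes=2):
        super().__init__()
        mlp1 = Sequential(Linear(in_dim, hidden), ReLU(), Linear(hidden, hidden))
        mlp2 = Sequential(Linear(hidden, hidden), ReLU(), Linear(hidden, hidden))
        self.conv1, self.conv2 = GINConv(mlp1), GINConv(mlp2)
        self.lin = Linear(hidden, num_classes)

    def forward(self, x, edge_index, batch):
        x = self.conv1(x, edge_index).relu()
        x = self.conv2(x, edge_index).relu()
        x = global_add_pool(x, batch)            # sum-pool node embeddings per graph
        return self.lin(x)

model = GIN(in_dim=dataset.num_features)
opt = torch.optim.Adam(model.parameters(), lr=0.01)
train_loader = DataLoader(train_ds, batch_size=32, shuffle=True)
test_loader = DataLoader(test_ds, batch_size=32)

for epoch in range(20):
    model.train()
    for data in train_loader:
        opt.zero_grad()
        out = model(data.x, data.edge_index, data.batch)
        F.cross_entropy(out, data.y).backward()
        opt.step()

# Accuracy = correctly classified test graphs / total test graphs.
model.eval()
correct = 0
with torch.no_grad():
    for batch in test_loader:
        pred = model(batch.x, batch.edge_index, batch.batch).argmax(dim=1)
        correct += int((pred == batch.y).sum())
print(f"Test accuracy: {correct / len(test_ds):.3f}")
```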