Graph Regression on PCQM4Mv2-LSC

Evaluation Metrics

Test MAE: mean absolute error (in eV) between predicted and DFT-computed HOMO-LUMO gaps on the test split
Validation MAE: the same error measured on the validation split
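
For reference, MAE is simply the mean of |y_pred − y_true| over all molecules. The sketch below computes it with the official OGB-LSC evaluator; it assumes the `ogb` and `torch` packages are installed, and the tensors are placeholder values rather than real model outputs.

```python
# Minimal sketch: computing the PCQM4Mv2 metric with the official OGB-LSC evaluator.
# Assumes `pip install ogb torch`; y_true / y_pred below are placeholders, not real data.
import torch
from ogb.lsc import PCQM4Mv2Evaluator

evaluator = PCQM4Mv2Evaluator()

y_true = torch.tensor([5.12, 3.87, 4.50])  # ground-truth HOMO-LUMO gaps (eV)
y_pred = torch.tensor([5.00, 4.00, 4.40])  # model predictions (eV)

result = evaluator.eval({'y_pred': y_pred, 'y_true': y_true})
print(result['mae'])  # equivalent to (y_pred - y_true).abs().mean()
```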

Evaluation Results

Performance of each model on this benchmark is listed below; a snippet for loading the dataset and its splits follows the table. Several entries report only a Validation MAE: the PCQM4Mv2 test labels are withheld, so test scores require an official OGB-LSC submission.

| Model | Test MAE | Validation MAE | Paper |
| --- | --- | --- | --- |
| MLP-Fingerprint | 0.1760 | 0.1753 | OGB-LSC: A Large-Scale Challenge for Machine Learning on Graphs |
| GCN | 0.1398 | 0.1379 | Semi-Supervised Classification with Graph Convolutional Networks |
| GIN | 0.1218 | 0.1195 | How Powerful are Graph Neural Networks? |
| TokenGT | 0.0919 | 0.0910 | Pure Transformers are Powerful Graph Learners |
| EGT+SSA | - | 0.0876 | The Information Pathways Hypothesis: Transformers are Dynamic Self-Ensembles |
| GRPE-Large | 0.0876 | 0.0867 | GRPE: Relative Positional Encoding for Graph Transformer |
| EGT+SSA+Self-ensemble | - | 0.0865 | The Information Pathways Hypothesis: Transformers are Dynamic Self-Ensembles |
| Graphormer | - | 0.0864 | Do Transformers Really Perform Bad for Graph Representation? |
| Graphormer + GFSA | - | 0.0860 | Graph Convolutions Enrich the Self-Attention in Transformers! |
| GRIT | - | 0.0859 | Graph Inductive Biases in Transformers without Message Passing |
| EGT | 0.0862 | 0.0857 | Global Self-Attention as a Replacement for Graph Convolution |
| GPS | 0.0862 | 0.0852 | Recipe for a General, Powerful, Scalable Graph Transformer |
| GPTrans-T | 0.0842 | 0.0833 | Graph Propagation Transformer for Graph Representation Learning |
| TIGT | - | 0.0826 | Topology-Informed Graph Transformer |
| GPTrans-L | 0.0821 | 0.0809 | Graph Propagation Transformer for Graph Representation Learning |
| Transformer-M | 0.0782 | 0.0772 | One Transformer Can Understand Both 2D & 3D Molecular Data |
| Uni-Mol+ | 0.0705 | 0.0693 | Highly Accurate Quantum Chemical Property Prediction with Uni-Mol+ |
| EGT + Triangular Attention | 0.0683 | 0.0671 | Global Self-Attention as a Replacement for Graph Convolution |
| TGT-At | 0.0683 | 0.0671 | Triplet Interaction Improves Graph Transformers: Accurate Molecular Graph Learning with Triplet Graph Transformers |
| ESA (Edge set attention, no positional encodings) | N/A | 0.0235 | An end-to-end attention-based approach for learning on graphs |
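
To evaluate against the same splits used for these numbers, the dataset and its official split indices can be loaded through the `ogb` package. A minimal sketch, assuming a default installation (the first call downloads and preprocesses the data):

```python
# Minimal sketch: loading PCQM4Mv2 and its official splits via ogb.
# Assumes `pip install ogb`; the first call downloads and preprocesses the data.
from ogb.lsc import PCQM4Mv2Dataset

dataset = PCQM4Mv2Dataset(root='dataset/')  # molecules as graph dicts + gap labels
split = dataset.get_idx_split()             # 'train', 'valid', 'test-dev', 'test-challenge'

graph, label = dataset[split['train'][0]]   # graph: dict (node_feat, edge_index, ...); label: gap in eV
print(len(split['train']), len(split['valid']))
```

Labels for the test-dev and test-challenge splits are not distributed, which is why local evaluation is normally done on the validation split.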