rLLM: Relational Table Learning with LLMs

Weichen Li, Xiaotong Huang, Jianwu Zheng, Zheng Wang, Chaokun Wang, Li Pan, Jianhua Li

Abstract

We introduce rLLM (relationLLM), a PyTorch library designed for Relational Table Learning (RTL) with Large Language Models (LLMs). The core idea is to decompose state-of-the-art Graph Neural Networks, LLMs, and Table Neural Networks into standardized modules, enabling the fast construction of novel RTL-type models in a simple "combine, align, and co-train" manner. To illustrate the usage of rLLM, we introduce a simple RTL method named BRIDGE. Additionally, we present three novel relational tabular datasets (TML1M, TLF2K, and TACM12K) by enhancing classic datasets. We hope rLLM can serve as a useful and easy-to-use development framework for RTL-related tasks. Our code is available at: https://github.com/rllm-project/rllm.
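
As a rough illustration of the "combine, align, and co-train" pattern, the sketch below builds a small RTL-style model in plain PyTorch. It is not rLLM's actual API: TableEncoder, GraphEncoder, BridgeModel, and all tensor shapes are hypothetical placeholders standing in for the library's standardized GNN, LLM, and Table Neural Network modules.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class TableEncoder(nn.Module):
    """Hypothetical tabular module: projects per-row features
    (e.g., LLM embeddings of serialized table rows) to a shared space."""
    def __init__(self, in_dim, hidden_dim):
        super().__init__()
        self.proj = nn.Linear(in_dim, hidden_dim)

    def forward(self, x):
        return F.relu(self.proj(x))

class GraphEncoder(nn.Module):
    """Hypothetical graph module: one mean-aggregation message-passing
    layer over the relation graph built from foreign-key links."""
    def __init__(self, hidden_dim):
        super().__init__()
        self.lin = nn.Linear(hidden_dim, hidden_dim)

    def forward(self, h, adj):
        # adj is assumed to be a row-normalized adjacency matrix.
        return F.relu(self.lin(adj @ h))

class BridgeModel(nn.Module):
    """Combine the two encoders, align them in a shared hidden space,
    and co-train everything end to end with a classification head."""
    def __init__(self, in_dim, hidden_dim, num_classes):
        super().__init__()
        self.table_enc = TableEncoder(in_dim, hidden_dim)
        self.graph_enc = GraphEncoder(hidden_dim)
        self.head = nn.Linear(hidden_dim, num_classes)

    def forward(self, x, adj):
        h = self.table_enc(x)        # combine: tabular representation
        h = self.graph_enc(h, adj)   # align: propagate over relational structure
        return self.head(h)          # co-train: single loss over both modules

# Toy co-training loop on random placeholder data.
x = torch.randn(100, 32)            # 100 rows, 32-dim row embeddings
adj = torch.eye(100)                # identity stands in for a normalized FK graph
y = torch.randint(0, 4, (100,))     # 4 placeholder classes
model = BridgeModel(32, 64, 4)
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
for _ in range(5):
    opt.zero_grad()
    loss = F.cross_entropy(model(x, adj), y)
    loss.backward()
    opt.step()
```

In a BRIDGE-style model the row embeddings would come from an LLM or table encoder and the adjacency from foreign-key links between tables; here both are random placeholders so the sketch runs standalone.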

Code Repositories

rllm-project/rllm (PyTorch, mentioned in GitHub)
rllm-project/rllm_datasets (official, mentioned in GitHub)

Benchmarks

Benchmark                   Methodology   Accuracy
classification-on-tacm12k   BRIDGE        25.6
classification-on-tlf2k     BRIDGE        42.2
classification-on-tml1m     BRIDGE        36.2
