rLLM: Relational Table Learning with LLMs
Li, Weichen; Huang, Xiaotong; Zheng, Jianwu; Wang, Zheng; Wang, Chaokun; Pan, Li; Li, Jianhua
arXiv.org Artificial Intelligence
We introduce rLLM (relationLLM), a PyTorch library designed for Relational Table Learning (RTL) with Large Language Models (LLMs). The core idea is to decompose state-of-the-art Graph Neural Networks, LLMs, and Table Neural Networks into standardized modules, enabling the fast construction of novel RTL-type models in a simple "combine, align, and co-train" manner. To illustrate the usage of rLLM, we introduce a simple RTL method named BRIDGE. Additionally, we present three novel relational tabular datasets (TML1M, TLF2K, and TACM12K) by enhancing classic datasets. We hope rLLM can serve as a useful and easy-to-use development framework for RTL-related tasks. Our code is available at: https://github.com/rllm-project/rllm.
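To make the "combine, align, and co-train" idea concrete, here is a minimal, hypothetical sketch. It is NOT the actual rLLM API: all class and function names below (TableEncoder, GraphEncoder, combine) are invented stand-ins that only illustrate how standardized table, LLM, and graph modules could be composed.

```python
# Hypothetical sketch, not the rLLM library itself: toy modules showing
# the "combine, align, and co-train" composition pattern.

class TableEncoder:
    """Stand-in for a Table Neural Network module: embeds one table row."""
    def __init__(self, dim):
        self.dim = dim

    def encode(self, row):
        # Deterministic toy features; a real TNN would learn these.
        vals = [float(hash(str(v)) % 97) / 97.0 for v in row.values()]
        return (vals + [0.0] * self.dim)[: self.dim]

class GraphEncoder:
    """Stand-in for a GNN module: one round of mean neighbor aggregation."""
    def propagate(self, embeddings, edges):
        out = {}
        for node, emb in embeddings.items():
            neigh = [embeddings[dst] for src, dst in edges if src == node]
            msgs = neigh + [emb]  # include the node's own embedding
            out[node] = [sum(x) / len(msgs) for x in zip(*msgs)]
        return out

def combine(table_emb, llm_emb):
    """'Align' step sketch: concatenate table and LLM features per entity."""
    return {k: table_emb[k] + llm_emb[k] for k in table_emb}

# Combine: encode each row of a toy relational table.
rows = {"u1": {"age": 30, "job": "artist"}, "u2": {"age": 25, "job": "doctor"}}
tnn = TableEncoder(dim=4)
table_emb = {k: tnn.encode(r) for k, r in rows.items()}

# Align: fuse with (fake, hard-coded) LLM text embeddings of the same entities.
llm_emb = {"u1": [0.1, 0.2], "u2": [0.3, 0.4]}
fused = combine(table_emb, llm_emb)

# Co-train would jointly optimize all modules end to end; here we simply
# propagate the fused features over the relation graph once.
gnn = GraphEncoder()
node_emb = gnn.propagate(fused, edges=[("u1", "u2"), ("u2", "u1")])
```

In the real library the modules are learnable PyTorch components; the point of the sketch is only the data flow between the three standardized module types.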
Jul-29-2024