Transformers Meet Relational Databases
Transformer models have continuously expanded into all machine learning domains convertible to the underlying sequence-to-sequence representation, including tabular data. However, while ubiquitous, this representation restricts their extension to the more general case of relational databases. In this paper, we introduce a modular neural message-passing scheme that closely adheres to the formal relational model, enabling direct end-to-end learning of tabular Transformers from database storage systems. We address the challenges of appropriate learning data representation and loading, which are critical in the database setting, and compare our approach against a number of representative models from related fields across a wide range of datasets. Our results demonstrate the superior performance of this newly proposed class of neural architectures.
arXiv.org Artificial Intelligence
Dec-6-2024
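To make the core idea concrete, the following is a minimal, illustrative sketch of what combining a per-row tabular Transformer with message passing along foreign-key links could look like. It is not the authors' implementation; the layer structure, the toy schema, and all names and dimensions below are assumptions made purely for demonstration.

```python
# A minimal sketch (not the paper's code): self-attention over a row's
# attribute embeddings, followed by message passing between rows that
# are connected by a foreign key. Schema and shapes are hypothetical.
import torch
import torch.nn as nn


class RelationalMessagePassingLayer(nn.Module):
    """One layer: attend across a row's attributes, then exchange
    messages along foreign-key edges between rows."""

    def __init__(self, dim: int, heads: int = 4):
        super().__init__()
        # Tabular-Transformer part: self-attention across a row's attributes.
        self.attr_attn = nn.MultiheadAttention(dim, heads, batch_first=True)
        # Message-passing part: transform and aggregate neighbor rows.
        self.msg = nn.Linear(dim, dim)
        self.update = nn.Linear(2 * dim, dim)

    def forward(self, rows: torch.Tensor, fk_src: torch.Tensor,
                fk_dst: torch.Tensor) -> torch.Tensor:
        # rows: (num_rows, num_attrs, dim) attribute embeddings per row.
        attended, _ = self.attr_attn(rows, rows, rows)
        row_repr = attended.mean(dim=1)                  # (num_rows, dim)

        # Aggregate messages along foreign-key edges (src -> dst).
        messages = torch.zeros_like(row_repr)
        messages.index_add_(0, fk_dst, self.msg(row_repr[fk_src]))

        updated = self.update(torch.cat([row_repr, messages], dim=-1))
        # Broadcast the updated row context back onto its attributes.
        return attended + updated.unsqueeze(1)


# Toy usage: 5 rows with 3 attributes each; two foreign-key edges.
layer = RelationalMessagePassingLayer(dim=32)
rows = torch.randn(5, 3, 32)
fk_src = torch.tensor([0, 1])  # referencing rows
fk_dst = torch.tensor([3, 4])  # referenced rows receiving messages
out = layer(rows, fk_src, fk_dst)
print(out.shape)  # torch.Size([5, 3, 32])
```

Stacking such layers lets information propagate across multiple foreign-key hops, which is one plausible reading of how a message-passing scheme can follow the relational model end to end.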