# Relational Transformer
This repository contains the official checkpoints for the Relational Transformer (RT), introduced in the paper Relational Transformer: Toward Zero-Shot Foundation Models for Relational Data.
Relational Transformer is a foundation model architecture designed to be pretrained on diverse relational databases and applied to unseen datasets and tasks without task- or dataset-specific fine-tuning. It utilizes a novel Relational Attention mechanism over columns, rows, and primary-foreign key links.
- Paper: Relational Transformer: Toward Zero-Shot Foundation Models for Relational Data
- GitHub Repository: snap-stanford/relational-transformer
## Installation
The repository uses pixi for package management.
```shell
git clone https://github.com/snap-stanford/relational-transformer
cd relational-transformer
pixi install

# compile and install the Rust sampler
cd rustler
pixi run maturin develop --uv --release
```
## Checkpoints
The project provides two types of checkpoints:
- `pretrain_<dataset>_<task>.pt`: pretrained with the specified `<dataset>` held out.
- `contd-pretrain_<dataset>_<task>.pt`: obtained by continued pretraining on `<dataset>` with the specified `<task>` held out.
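When scripting over many checkpoints, the naming scheme above can be split back into its parts. A minimal sketch, assuming underscores only ever separate the three fields (the example names use hyphens inside each field); `parse_checkpoint_name` is a hypothetical helper, not part of the repository:

```python
def parse_checkpoint_name(filename: str) -> tuple[str, str, str]:
    """Split a checkpoint filename into (mode, dataset, task).

    Hypothetical helper: assumes the `<mode>_<dataset>_<task>.pt` scheme,
    where fields themselves contain no underscores.
    """
    stem = filename.removesuffix(".pt")
    mode, dataset, task = stem.split("_")
    return mode, dataset, task
```

For example, `parse_checkpoint_name("contd-pretrain_rel-amazon_user-churn.pt")` yields `("contd-pretrain", "rel-amazon", "user-churn")`.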
You can download specific checkpoints using the Hugging Face CLI:
```shell
mkdir -p ~/scratch/rt_ckpts
huggingface-cli download rishabh-ranjan/relational-transformer \
    --repo-type model \
    --include "pretrain_rel-amazon_user-churn.pt" \
    --local-dir ~/scratch/rt_ckpts \
    --local-dir-use-symlinks False
```
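The same download can also be done from Python via the `huggingface_hub` library. A sketch, assuming the package is installed (`pip install huggingface_hub`); `checkpoint_filename` and `download_checkpoint` are hypothetical convenience wrappers, not part of the repository:

```python
def checkpoint_filename(mode: str, dataset: str, task: str) -> str:
    # Assemble a filename following the repository's naming scheme,
    # e.g. ("pretrain", "rel-amazon", "user-churn") -> "pretrain_rel-amazon_user-churn.pt".
    return f"{mode}_{dataset}_{task}.pt"


def download_checkpoint(mode: str, dataset: str, task: str, local_dir: str) -> str:
    # Requires `huggingface_hub` and network access; returns the local path.
    from huggingface_hub import hf_hub_download

    return hf_hub_download(
        repo_id="rishabh-ranjan/relational-transformer",
        filename=checkpoint_filename(mode, dataset, task),
        local_dir=local_dir,
    )
```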
## Usage
To use these checkpoints, pass the path to the `load_ckpt_path` argument in the training scripts provided in the GitHub repository. For example, to run a finetuning experiment:
```shell
pixi run torchrun --standalone --nproc_per_node=8 scripts/example_finetune.py
```
## Citation
```bibtex
@inproceedings{ranjan2025relationaltransformer,
  title={{Relational Transformer:} Toward Zero-Shot Foundation Models for Relational Data},
  author={Rishabh Ranjan and Valter Hudovernik and Mark Znidar and Charilaos Kanatsoulis and Roshan Upendra and Mahmoud Mohammadi and Joe Meyer and Tom Palczewski and Carlos Guestrin and Jure Leskovec},
  booktitle={The Fourteenth International Conference on Learning Representations},
  year={2026}
}
```