Foundation Models for Causal Inference via Prior-Data Fitted Networks
Yuchen Ma, Dennis Frauen, Emil Javurek, Stefan Feuerriegel
arXiv.org Artificial Intelligence
Prior-data fitted networks (PFNs) have recently been proposed as a promising way to train tabular foundation models. PFNs are transformers that are pre-trained on synthetic data generated from a prespecified prior distribution and that enable Bayesian inference through in-context learning. In this paper, we introduce CausalFM, a comprehensive framework for training PFN-based foundation models in various causal inference settings. First, we formalize the construction of Bayesian priors for causal inference based on structural causal models (SCMs) in a principled way and derive necessary criteria for the validity of such priors. Building on this, we propose a novel family of prior distributions using causality-inspired Bayesian neural networks that enable CausalFM to perform Bayesian causal inference in various settings, including for back-door, front-door, and instrumental variable adjustment. Finally, we instantiate CausalFM and explicitly train models to perform in-context learning in these settings. We show that CausalFM achieves competitive in-context learning performance even when compared to baselines that are specifically trained for the task at hand. In sum, our framework can be used as a general recipe to train foundation models for various causal inference settings. In contrast to the current state-of-the-art in causal inference, CausalFM offers a novel paradigm with the potential to fundamentally change how practitioners perform causal inference in medicine, economics, and other disciplines.
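The pretraining recipe the abstract describes — sampling many synthetic datasets from a prior over structural causal models — can be illustrated with a minimal sketch. The prior below (a back-door structure with a single confounder and Gaussian parameters) is a hypothetical simplification for illustration, not the causality-inspired Bayesian neural network prior proposed in the paper:

```python
import numpy as np

def sample_scm_dataset(rng, n=100):
    """Draw one synthetic dataset from a toy SCM prior (illustrative only).

    Back-door structure: X -> T, X -> Y, T -> Y, where X is an
    observed confounder, T a binary treatment, and Y the outcome.
    """
    alpha = rng.normal()  # prior draw: effect of X on treatment propensity
    beta = rng.normal()   # prior draw: effect of X on outcome
    tau = rng.normal()    # prior draw: treatment effect

    X = rng.normal(size=n)                    # confounder
    p = 1.0 / (1.0 + np.exp(-alpha * X))      # propensity score
    T = rng.binomial(1, p)                    # treatment assignment
    Y = beta * X + tau * T + rng.normal(scale=0.1, size=n)  # outcome
    return X, T, Y, tau  # tau is the ground-truth effect for this task

# A PFN-style model would be pre-trained across many such "tasks",
# learning to infer the effect in-context from (X, T, Y) alone.
rng = np.random.default_rng(0)
tasks = [sample_scm_dataset(rng) for _ in range(16)]
```

In actual PFN pretraining, each sampled dataset is fed to a transformer as context, and the training loss teaches the model to approximate the Bayesian posterior implied by the prior; the sketch above only shows the data-generating side of that loop.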
Sep-30-2025
- Country:
  - Asia > Vietnam (0.04)
  - Europe
    - Germany > Bavaria > Upper Bavaria > Munich (0.04)
    - United Kingdom > England > Cambridgeshire > Cambridge (0.04)
- Genre:
  - Research Report > Experimental Study (0.46)
- Industry:
  - Health & Medicine (1.00)