Large Language Bayes
arXiv.org Artificial Intelligence
Many domain experts do not have the time or expertise to write formal Bayesian models. This paper proposes a method that takes an informal problem description as input and combines a large language model with a probabilistic programming language to define a joint distribution over formal models, latent variables, and data. A posterior over latent variables follows by conditioning on observed data and integrating over formal models. This presents a challenging inference problem. We suggest an inference recipe that amounts to generating many formal models from the large language model, performing approximate inference on each, and then taking a weighted average. This is justified and analyzed as a combination of self-normalized importance sampling, MCMC, and importance-weighted variational inference. Experimentally, this produces sensible predictions from only data and an informal problem description, without the need to specify a formal model.
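The recipe described in the abstract can be sketched in a few lines. The following is a minimal, hypothetical illustration (not the paper's implementation): each candidate formal model contributes an approximate log marginal likelihood log p(data | model), and per-model posterior predictions are combined with self-normalized importance weights. The function and variable names are assumptions for illustration.

```python
import math

def posterior_mixture(log_marginals, predictions):
    """Combine per-model predictions with self-normalized importance
    weights derived from each model's approximate log marginal
    likelihood log p(data | model)."""
    m = max(log_marginals)                      # subtract max for numerical stability
    weights = [math.exp(l - m) for l in log_marginals]
    total = sum(weights)
    weights = [w / total for w in weights]      # self-normalization: weights sum to 1
    # weighted average of per-model posterior predictions
    return sum(w * p for w, p in zip(weights, predictions))

# Toy usage: three candidate models with different evidence and predictions.
log_Z = [-3.2, -1.1, -5.0]   # approximate log marginal likelihoods (illustrative)
preds = [0.4, 0.7, 0.2]      # each model's posterior mean of some quantity
print(posterior_mixture(log_Z, preds))
```

Models whose marginal likelihood is higher dominate the average, so implausible formal models generated by the language model are automatically down-weighted.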
Oct-27-2025