Langevin Flows for Modeling Neural Latent Dynamics
Song, Yue, Keller, T. Anderson, Yue, Yisong, Perona, Pietro, Welling, Max
arXiv.org Artificial Intelligence
In this work, we introduce LangevinFlow, a sequential Variational Auto-Encoder in which the time evolution of latent variables is governed by the underdamped Langevin equation. Our approach incorporates physical priors (inertia, damping, a learned potential function, and stochastic forces) to represent both autonomous and non-autonomous processes in neural systems. Crucially, the potential function is parameterized as a network of locally coupled oscillators, biasing the model toward the oscillatory and flow-like behaviors observed in biological neural populations. Our model features a recurrent encoder, a one-layer Transformer decoder, and Langevin dynamics in the latent space. Empirically, our method outperforms state-of-the-art baselines on synthetic neural populations generated by a Lorenz attractor, closely matching ground-truth firing rates. On the Neural Latents Benchmark (NLB), the model achieves superior held-out neuron likelihoods (bits per spike) and forward prediction accuracy across four challenging datasets. It also matches or surpasses alternative methods in decoding behavioral metrics such as hand velocity. Overall, this work introduces a flexible, physics-inspired, high-performing framework for modeling complex neural population dynamics and their unobserved influences.
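The abstract describes latent dynamics driven by the underdamped Langevin equation with a coupled-oscillator potential. As a minimal sketch (not the paper's implementation, which learns the potential with a network), the following shows an Euler-Maruyama discretization of underdamped Langevin dynamics with a hand-written coupled-oscillator potential gradient; the function names, parameters, and ring-coupling structure are illustrative assumptions.

```python
import numpy as np

def langevin_step(z, v, grad_potential, mass=1.0, damping=0.5,
                  temperature=1.0, dt=0.01, rng=None):
    """One Euler-Maruyama step of the underdamped Langevin equation:
        m dv = (-grad U(z) - gamma * v) dt + sqrt(2 * gamma * T) dW
        dz = v dt
    All hyperparameters here are illustrative, not the paper's values.
    """
    rng = np.random.default_rng(0) if rng is None else rng
    noise = np.sqrt(2.0 * damping * temperature * dt) * rng.standard_normal(z.shape)
    v = v + (dt * (-grad_potential(z) - damping * v) + noise) / mass
    z = z + dt * v
    return z, v

def coupled_oscillator_grad(z, k=1.0, coupling=0.2):
    """Gradient of a toy potential with locally coupled harmonic oscillators
    on a ring (an assumption; the paper parameterizes U with a network)."""
    neighbors = np.roll(z, 1) + np.roll(z, -1)
    return k * z - coupling * (neighbors - 2.0 * z)

# Roll the latent state forward for a few steps.
z, v = np.zeros(8), np.ones(8)
for _ in range(100):
    z, v = langevin_step(z, v, coupled_oscillator_grad, dt=0.01)
```

The damping term dissipates velocity while the stochastic force injects noise, so trajectories settle into noisy oscillations around the potential's minima rather than diverging or freezing.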
Jul-16-2025