Sample as You Infer: Predictive Coding With Langevin Dynamics

Umais Zahid, Qinghai Guo, Zafeirios Fountas

arXiv.org Artificial Intelligence 

It is well known that neuronal systems, including their dynamics and responses, are rife with noise at multiple levels (Faisal et al., 2008; Shadlen & Newsome, 1998). These sources of noise arise from, amongst other things, stochastic processes occurring at the sub-cellular level, impacting neuronal responses through, for example, fluctuations in membrane potential (Derksen & Verveen, 1966). Yet the precise role of such randomness in information processing remains an open question (McDonnell & Ward, 2011; Deco et al., 2013). The Langevin PC algorithm suggests that one such role may be the principled exploration of the latent space of hypotheses under one's generative model. Secondly, from the perspective of Langevin PC as an in-silico generative modelling algorithm, we note a number of interesting avenues that we have not had the time to explore here. These include: models with a hierarchy of stochastic variables, such as those found in most state-of-the-art VAE models (Child, 2021; Vahdat & Kautz, 2021; Hazami et al., 2022), which may require adopting a corresponding top-down hierarchical warm-start model; automatic convergence criteria for determining when the Markov chain has converged to a given level of error (Roy, 2020); and underdamped Langevin dynamics, which incorporate auxiliary momentum variables into the Langevin sampling to achieve an accelerated rate of convergence (Cheng et al., 2018; Ma et al., 2019).
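To make the contrast between the two sampling schemes concrete, the sketch below runs an overdamped Langevin update on the latents of a toy linear-Gaussian generative model alongside a momentum-augmented (underdamped) variant. The model, step size, and friction coefficient are illustrative assumptions rather than values from the paper, and the underdamped update is a basic Euler-Maruyama discretisation, not the refined integrators analysed by Cheng et al. (2018).

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy linear-Gaussian generative model: x = W z + noise, prior z ~ N(0, I).
# W, sigma, eta, and gamma are illustrative assumptions for this sketch.
latent_dim, obs_dim, sigma = 4, 8, 0.5
W = rng.normal(size=(obs_dim, latent_dim)) / np.sqrt(latent_dim)
x = W @ rng.normal(size=latent_dim) + sigma * rng.normal(size=obs_dim)

def grad_energy(z):
    """Gradient of E(z) = ||x - W z||^2 / (2 sigma^2) + ||z||^2 / 2."""
    return -W.T @ (x - W @ z) / sigma**2 + z

def overdamped_step(z, eta=1e-3):
    """One Euler-Maruyama step of overdamped Langevin dynamics on z."""
    noise = rng.normal(size=z.shape)
    return z - eta * grad_energy(z) + np.sqrt(2 * eta) * noise

def underdamped_step(z, p, eta=1e-3, gamma=2.0):
    """One Euler step of underdamped Langevin dynamics with momentum p."""
    noise = rng.normal(size=z.shape)
    p = p - eta * (gamma * p + grad_energy(z)) + np.sqrt(2 * gamma * eta) * noise
    z = z + eta * p
    return z, p

# Both chains start from zeros; a warm start would instead initialise z
# from an amortised inference (encoder) pass.
z_od = np.zeros(latent_dim)
z_ud, p_ud = np.zeros(latent_dim), np.zeros(latent_dim)
for _ in range(5000):
    z_od = overdamped_step(z_od)
    z_ud, p_ud = underdamped_step(z_ud, p_ud)

print("overdamped sample :", np.round(z_od, 3))
print("underdamped sample:", np.round(z_ud, 3))
```

After burn-in, iterates of either chain can be read as approximate posterior samples over the latents; the momentum variable in the underdamped scheme is what the cited works exploit to obtain faster mixing.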