Finding separatrices of dynamical flows with Deep Koopman Eigenfunctions
Dabholkar, Kabir V., Barak, Omri
Many natural systems, including neural circuits involved in decision making, are modeled as high-dimensional dynamical systems with multiple stable states. While existing analytical tools primarily describe behavior near stable equilibria, characterizing separatrices--the manifolds that separate different basins of attraction--remains challenging, particularly in high-dimensional settings. Here, we introduce a numerical framework leveraging Koopman theory combined with deep neural networks to effectively characterize separatrices. Specifically, we approximate Koopman eigenfunctions (KEFs) associated with real positive eigenvalues, which vanish precisely at the separatrices. Using these scalar KEFs, optimization methods efficiently locate separatrices even in complex systems. We demonstrate our approach on synthetic benchmarks, ecological network models, and high-dimensional recurrent neural networks either trained on neuroscience-inspired tasks or fit to real neural data. Moreover, we illustrate the practical utility of our method by designing optimal perturbations that shift systems across separatrices, enabling predictions relevant to optogenetic stimulation experiments in neuroscience.
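The key property the abstract relies on, that a Koopman eigenfunction (KEF) phi with real positive eigenvalue lam satisfies f(x) phi'(x) = lam phi(x) and vanishes on the separatrix, can be illustrated on a toy 1D bistable flow. The sketch below uses simple polynomial collocation as a hand-rolled stand-in for the paper's deep networks; the cubic flow, the collocation domain, and the polynomial degree are our own illustrative choices, not taken from the paper.

```python
import numpy as np

# Toy bistable flow dx/dt = f(x) = -(x + 1)(x - 0.3)(x - 2):
# attractors at x = -1 and x = 2, saddle (the separatrix in 1D) at x = 0.3.
# A KEF phi with eigenvalue lam satisfies f(x) phi'(x) = lam phi(x);
# for real positive lam, phi vanishes exactly on the separatrix.
f = lambda x: -(x + 1.0) * (x - 0.3) * (x - 2.0)

# Represent phi(x) = sum_j c_j x^j and enforce the eigenfunction equation
# at Chebyshev nodes, giving the generalized eigenproblem A c = lam B c.
deg = 10
j = np.arange(deg + 1)
t = np.cos(np.pi * (2 * np.arange(deg + 1) + 1) / (2 * (deg + 1)))
xc = 0.35 + 0.85 * t              # nodes in [-0.5, 1.2], away from attractors
B = xc[:, None] ** j              # B[i, j] = phi_j(x_i)
A = np.zeros_like(B)              # A[i, j] = f(x_i) * phi_j'(x_i)
A[:, 1:] = f(xc)[:, None] * j[1:] * xc[:, None] ** (j[1:] - 1)

w, V = np.linalg.eig(np.linalg.solve(B, A))
ok = (np.abs(w.imag) < 1e-6) & (w.real > 1e-3)   # real, positive, nontrivial
idx = np.argmin(np.where(ok, w.real, np.inf))    # smallest such eigenvalue
lam, c = w.real[idx], V[:, idx].real

# The separatrix is located where the KEF vanishes: minimize |phi|.
grid = np.linspace(-0.5, 1.2, 20001)
phi = np.polyval(c[::-1], grid)
sep = grid[np.argmin(np.abs(phi))]
print(lam, sep)   # for this cubic, the saddle's linearization is f'(0.3) = 2.21
```

In 1D the separatrix is a single point, so a grid search over |phi| suffices; the paper's setting is the high-dimensional analogue, where a learned scalar KEF turns locating the separatrix into a scalar root-finding/optimization problem.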
Variational principle to regularize machine-learned density functionals: the non-interacting kinetic-energy functional
del Mazo-Sevillano, P., Hermann, J.
Practical density functional theory (DFT) owes its success to the groundbreaking work of Kohn and Sham that introduced the exact calculation of the non-interacting kinetic energy of the electrons using an auxiliary mean-field system. However, the full power of DFT will not be unleashed until the exact relationship between the electron density and the non-interacting kinetic energy is found. Various attempts have been made to approximate this functional, as has been done for the exchange--correlation functional, but with much less success, owing to the kinetic energy's larger contribution and more non-local nature. In this work, we propose a new and efficient regularization method to train density functionals based on deep neural networks, with particular interest in the kinetic-energy functional. The method is tested on (effectively) one-dimensional systems, including the hydrogen chain, non-interacting electrons, and atoms of the first two periods, with excellent results. For the atomic systems, the generalizability of the regularization method is demonstrated by also training an exchange--correlation functional, and the contrasting nature of the two functionals is discussed from a machine-learning perspective.
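The general idea of regularizing a learned kinetic-energy functional with a variational condition can be sketched on a toy 1D problem. Everything below is our illustrative assumption, not the paper's method: a linear model with Thomas--Fermi and von Weizsaecker features replaces the deep network, the training densities are particle-in-a-box ground states, and the regularizer penalizes violation of the Euler equation delta T/delta n = mu (equivalently, that the spatial derivative of delta T/delta n vanishes, which eliminates the unknown mu).

```python
import numpy as np

# Toy variational regularization for a machine-learned kinetic functional.
# Model (illustrative): T_w[n] = w1 * int n^3 dx + w2 * int (n')^2/(8 n) dx,
# trained on densities of N spinless fermions in a unit box (v = 0 inside).
M = 400
x = (np.arange(M) + 0.5) / M            # midpoint grid on (0, 1); n > 0 here
dx = 1.0 / M
grad = lambda a: np.gradient(a, dx)

def density(N):
    """Ground-state density of the N lowest box orbitals sqrt(2) sin(k pi x)."""
    k = np.arange(1, N + 1)[:, None]
    return (2 * np.sin(k * np.pi * x[None, :]) ** 2).sum(axis=0)

Ns = [1, 2, 3, 4, 5]
T = np.array([np.pi**2 * (np.arange(1, N + 1) ** 2).sum() / 2 for N in Ns])

F, G = [], []                            # energy features / Euler-residual rows
for N in Ns:
    n = density(N)
    dn = grad(n)
    F.append([np.sum(n**3) * dx, np.sum(dn**2 / (8 * n)) * dx])
    v1 = 3 * n**2                                  # delta/delta n, TF feature
    v2 = dn**2 / (8 * n**2) - grad(dn) / (4 * n)   # delta/delta n, vW feature
    # d/dx of delta T/delta n must vanish on a ground-state density
    rows = np.stack([grad(v1), grad(v2)], axis=1)
    G.append(rows[M // 10 : -M // 10 : 5])         # trim box edges, subsample
F, G = np.array(F), np.concatenate(G)

def fit(beta):
    """Least squares on energies plus a beta-weighted Euler-residual penalty."""
    A = np.vstack([F, np.sqrt(beta) * dx * G])
    b = np.concatenate([T, np.zeros(len(G))])
    return np.linalg.lstsq(A, b, rcond=None)[0]

e_loss = lambda w: np.sum((F @ w - T) ** 2)        # energy-fit loss
u_loss = lambda w: np.sum((G @ w) ** 2)            # variational (Euler) loss
w_plain, w_reg = fit(0.0), fit(1e-4)
```

By construction, the regularized fit trades a (possibly slightly) worse energy fit for a smaller Euler residual, i.e. a functional whose variational minimization better reproduces the training densities; that trade-off, not the specific linear model, is the point of the sketch.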