Formalized Hopfield Networks and Boltzmann Machines
Matteo Cipollina, Michail Karatarakis, Freek Wiedijk
arXiv.org Artificial Intelligence
Neural networks are widely used, yet their analysis and verification remain challenging. In this work, we present a Lean 4 formalization of neural networks, covering both deterministic and stochastic models. We first formalize Hopfield networks, recurrent networks that store patterns as stable states. We prove convergence and the correctness of Hebbian learning, a training rule that updates network parameters to encode patterns; our correctness proof covers the case of pairwise-orthogonal patterns. We then consider stochastic networks, where updates are probabilistic and convergence is to a stationary distribution. As a canonical example, we formalize the dynamics of Boltzmann machines and prove their ergodicity, showing convergence to a unique stationary distribution using a new formalization of the Perron-Frobenius theorem.
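The Hopfield construction the abstract describes can be illustrated with a minimal numerical sketch (this is not the paper's Lean 4 development; the network size, patterns, and the synchronous sign-update convention below are illustrative assumptions). For pairwise-orthogonal ±1 patterns, the Hebbian weight matrix makes each stored pattern a fixed point of the update rule:

```python
import numpy as np

def hebbian_weights(patterns):
    # Hebbian rule: W = (1/N) * sum_mu p_mu p_mu^T, with zero diagonal
    # (no self-coupling), for a stack of ±1 patterns of dimension N.
    N = patterns.shape[1]
    W = sum(np.outer(p, p) for p in patterns) / N
    np.fill_diagonal(W, 0.0)
    return W

def update(W, s):
    # Synchronous sign update; ties (h == 0) map to +1 by convention here.
    h = W @ s
    return np.where(h >= 0, 1, -1)

# Two pairwise-orthogonal ±1 patterns (illustrative choice, N = 4).
p1 = np.array([1, 1, -1, -1])
p2 = np.array([1, -1, 1, -1])
assert p1 @ p2 == 0  # orthogonality, the hypothesis of the correctness result

W = hebbian_weights(np.stack([p1, p2]))

# Each stored pattern is a stable state (fixed point) of the dynamics.
assert np.array_equal(update(W, p1), p1)
assert np.array_equal(update(W, p2), p2)
```

The fixed-point check mirrors the statement being formalized: with orthogonal patterns, the cross-terms in `W @ p` vanish, so the local field at each neuron has the same sign as the stored pattern.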
Dec-9-2025