Bulk-boundary decomposition of neural networks
Donghee Lee, Hye-Sung Lee, Jaeok Yi
arXiv.org Artificial Intelligence
Department of Physics, Korea Advanced Institute of Science and Technology, Daejeon 34141, Korea (Dated: November 2025)

We present the bulk-boundary decomposition as a new framework for understanding the training dynamics of deep neural networks. Starting from the stochastic gradient descent formulation, we show that the Lagrangian can be reorganized into a data-independent bulk term and a data-dependent boundary term. The bulk captures the intrinsic dynamics set by the network architecture and activation functions, while the boundary reflects stochastic interactions from training samples at the input and output layers. As a natural extension, we develop a field-theoretic formulation of neural dynamics based on this decomposition.

Introduction -- Deep neural networks have achieved remarkable empirical success across diverse domains, yet the fundamental principles governing their learning dynamics remain unclear [1-3].
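The abstract's starting point, the stochastic gradient descent formulation, can be made concrete with a minimal sketch. The toy two-layer network below is an illustration only: the paper's actual Lagrangian and its bulk-boundary decomposition are not reproduced here, and all names and hyperparameters (`hidden_dim`, `lr`, the squared loss) are assumptions for the example. The comments mark where, in the paper's language, architecture-intrinsic ("bulk") quantities and data-dependent ("boundary") quantities enter an SGD step.

```python
import numpy as np

rng = np.random.default_rng(0)

def init_params(in_dim=2, hidden_dim=8, out_dim=1):
    """Random weights for a toy two-layer tanh network (illustrative)."""
    return {
        "W1": rng.normal(0.0, 0.5, (hidden_dim, in_dim)),
        "W2": rng.normal(0.0, 0.5, (out_dim, hidden_dim)),
    }

def forward(p, x):
    # "Bulk" ingredients: the architecture (layer structure) and the
    # activation function, independent of any particular training sample.
    h = np.tanh(p["W1"] @ x)
    return p["W2"] @ h, h

def sgd_step(p, x, y, lr=0.1):
    # "Boundary" ingredients: the data (x, y) enter only at the input
    # and output layers, consistent with the decomposition described
    # in the abstract.
    yhat, h = forward(p, x)
    err = yhat - y                       # gradient of L = 0.5 * ||err||^2
    grad_W2 = np.outer(err, h)
    dh = (p["W2"].T @ err) * (1.0 - h**2)  # backprop through tanh
    grad_W1 = np.outer(dh, x)
    p["W2"] -= lr * grad_W2
    p["W1"] -= lr * grad_W1
    return float(0.5 * err @ err)

p = init_params()
x = np.array([0.5, -1.0])
y = np.array([0.3])
losses = [sgd_step(p, x, y) for _ in range(50)]
print(losses[0], losses[-1])
```

The point of the sketch is only to separate what depends on the sample pair `(x, y)` from what is fixed by the network itself; the paper's field-theoretic formulation builds on that separation at the level of the training Lagrangian, not of a single update rule.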
Nov-5-2025