Bulk-boundary decomposition of neural networks

Donghee Lee, Hye-Sung Lee, Jaeok Yi

arXiv.org Artificial Intelligence 

Department of Physics, Korea Advanced Institute of Science and Technology, Daejeon 34141, Korea (Dated: November 2025)

We present the bulk-boundary decomposition as a new framework for understanding the training dynamics of deep neural networks. Starting from the stochastic gradient descent formulation, we show that the Lagrangian can be reorganized into a data-independent bulk term and a data-dependent boundary term. The bulk captures the intrinsic dynamics set by network architecture and activation functions, while the boundary reflects stochastic interactions from training samples at the input and output layers. As a natural extension, we develop a field-theoretic formulation of neural dynamics based on this decomposition.

Introduction-- Deep neural networks have achieved remarkable empirical success across diverse domains, yet the fundamental principles governing their learning dynamics remain unclear [1-3].
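The split described in the abstract can be written schematically as follows; the symbols below are illustrative placeholders, not the paper's own notation, since the detailed Lagrangian is not given in this excerpt:

```latex
% Schematic form of the decomposition stated in the abstract:
% a data-independent bulk term plus a data-dependent boundary term.
% \theta denotes network parameters; \{(x_i, y_i)\} the training samples.
\mathcal{L}[\theta]
  = \underbrace{\mathcal{L}_{\mathrm{bulk}}[\theta]}_{\substack{\text{architecture and} \\ \text{activation functions}}}
  \; + \;
  \underbrace{\mathcal{L}_{\mathrm{bdy}}\!\left[\theta;\, \{(x_i, y_i)\}\right]}_{\substack{\text{stochastic sample interactions} \\ \text{at input/output layers}}}
```

Here only the boundary term carries the dependence on the training data, mirroring the bulk-boundary structure familiar from field theory.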
