Collective Kernel EFT for Pre-activation ResNets

Kawase, Hidetoshi, Ota, Toshihiro

arXiv.org Machine Learning

In finite-width deep neural networks, the empirical kernel $G$ evolves stochastically across layers. We develop a collective kernel effective field theory (EFT) for pre-activation ResNets based on a $G$-only closure hierarchy and diagnose its finite validity window. Exploiting the exact conditional Gaussianity of residual increments, we derive an exact stochastic recursion for $G$. Applying Gaussian approximations systematically yields a continuous-depth ODE system for the mean kernel $K_0$, the kernel covariance $V_4$, and the $1/n$ mean correction $K_{1,\mathrm{EFT}}$, which emerges diagrammatically as a one-loop tadpole correction. Numerically, $K_0$ remains accurate at all depths. However, the $V_4$ equation residual accumulates to an $O(1)$ error at finite time, primarily driven by approximation errors in the $G$-only transport term. Furthermore, $K_{1,\mathrm{EFT}}$ fails due to the breakdown of the source closure, which exhibits a systematic mismatch even at initialization. These findings highlight the limitations of $G$-only state-space reduction and suggest extending the state space to incorporate the sigma-kernel.
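To make the abstract's setting concrete, here is a minimal numerical sketch of how the empirical kernel $G$ of a finite-width pre-activation ResNet fluctuates across layers. The update rule, branch scaling `beta`, width `n`, and ReLU nonlinearity are illustrative assumptions, not the paper's exact setup.

```python
import numpy as np

rng = np.random.default_rng(0)

# Assumed toy residual update: h_{l+1} = h_l + (beta/sqrt(n)) * W_l @ phi(h_l),
# with phi = ReLU and i.i.d. Gaussian W_l. At finite width n the empirical
# Gram matrix G = h^T h / n evolves stochastically with depth; as n -> infinity
# it would concentrate on a deterministic mean kernel K_0.
n = 256      # finite width, so G fluctuates
L = 50       # depth
beta = 0.3   # residual-branch strength (assumed)

# Two inputs; G is their 2x2 empirical kernel over pre-activations.
h = rng.standard_normal((n, 2))
kernels = []
for layer in range(L):
    W = rng.standard_normal((n, n))
    h = h + (beta / np.sqrt(n)) * W @ np.maximum(h, 0.0)
    kernels.append(h.T @ h / n)  # empirical kernel after this layer

# Averaging kernels over many independent seeds would estimate K_0;
# their spread across seeds corresponds to the O(1/n) covariance V_4.
print(kernels[-1])
```

Repeating this over many seeds and comparing the sample mean and covariance of `kernels` against the continuous-depth ODE predictions is the kind of diagnostic the abstract describes.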








Emergent Graphical Conventions in a Visual Communication Game

Neural Information Processing Systems

Due to their iconic nature (i.e., perceptual resemblance to or natural association with the referent), drawings serve as a powerful tool for communicating concepts across language barriers (Fay et al., 2014). In fact, humans have used drawings to convey messages since as early as 40,000-60,000 years ago (Hoffmann et al., 2018; Hawkins et al., 2019).