Tuning Universality in Deep Neural Networks
arXiv.org Artificial Intelligence
Deep neural networks (DNNs) exhibit crackling-like avalanches whose origin lacks a mechanistic explanation. Here, I derive a stochastic theory of deep information propagation (DIP) by incorporating Central Limit Theorem (CLT)-level fluctuations. Four effective couplings $(r, h, D_1, D_2)$ characterize the dynamics, yielding a Landau description of the static exponents and a Directed Percolation (DP) structure of activity cascades. Tuning the couplings selects between avalanche dynamics generated by a Brownian Motion (BM) in a logarithmic trap and an absorbed free BM, each corresponding to a distinct universality class. Numerical simulations confirm the theory and demonstrate that activation function design controls the collective dynamics in random DNNs.
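The ordered and absorbing regimes that the abstract refers to can be illustrated with a minimal sketch of signal propagation through a random DNN. The sketch below is an assumption-laden toy, not the paper's method: it uses Gaussian weights, a tanh activation, and a weight-scale parameter `sigma_w` as a stand-in tuning knob (the paper's couplings $(r, h, D_1, D_2)$ are not modeled), and tracks the mean squared activation per layer as a proxy for activity.

```python
import numpy as np

rng = np.random.default_rng(0)

def propagate(depth=100, width=500, sigma_w=1.5, act=np.tanh):
    """Push a random input through a deep random network and record
    the mean squared activation per layer, a crude proxy for the
    'activity' studied in deep information propagation (DIP).
    sigma_w is an illustrative tuning parameter, not one of the
    paper's effective couplings."""
    x = rng.standard_normal(width)
    q_per_layer = []
    for _ in range(depth):
        # Gaussian weights scaled so pre-activation variance is O(sigma_w^2)
        W = rng.standard_normal((width, width)) * sigma_w / np.sqrt(width)
        x = act(W @ x)
        q_per_layer.append(np.mean(x**2))
    return np.array(q_per_layer)

# Small sigma_w: activity decays toward the absorbing state;
# large sigma_w: activity settles at a nonzero fixed point.
q_small = propagate(sigma_w=0.5)
q_large = propagate(sigma_w=2.0)
```

Sweeping `sigma_w` (or, per the abstract, changing the activation function) moves the network across an order-to-chaos transition, which is the kind of tuning the paper formalizes via its effective couplings.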
Dec-2-2025