Integral Signatures of Activation Functions: A 9-Dimensional Taxonomy and Stability Theory for Deep Learning
Ankur Mali, Lawrence Hall, Jake Williams, Gordon Richards
arXiv.org Artificial Intelligence
Activation functions govern the expressivity and stability of neural networks, yet existing comparisons remain largely heuristic. We propose a rigorous framework for their classification via a nine-dimensional integral signature S_sigma(phi), combining Gaussian propagation statistics (m1, g1, g2, m2, eta), asymptotic slopes (alpha_plus, alpha_minus), and regularity measures (TV(phi'), C(phi)). This taxonomy establishes well-posedness, affine reparameterization laws with bias, and closure under bounded slope variation. Dynamical analysis yields Lyapunov theorems with explicit descent constants and identifies variance stability regions through (m2', g2). From a kernel perspective, we derive dimension-free Hessian bounds and connect smoothness to bounded variation of phi'. Applying the framework, we classify eight standard activations (ReLU, leaky-ReLU, tanh, sigmoid, Swish, GELU, Mish, TeLU), proving sharp distinctions between saturating, linear-growth, and smooth families. Numerical Gauss-Hermite and Monte Carlo validation confirms theoretical predictions. Our framework provides principled design guidance, moving activation choice from trial-and-error to provable stability and kernel conditioning.
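The Gaussian propagation statistics and the Gauss-Hermite validation mentioned in the abstract can be sketched numerically. The definitions used below (m1 = E[phi(z)], m2 = E[phi(z)^2], g1 = E[phi'(z)], g2 = E[phi'(z)^2] for z ~ N(0,1)) are the standard mean-field moments and are an assumption inferred from the abstract, not taken from the paper itself; ReLU is used as the worked example.

```python
import numpy as np

def gh_gaussian_expect(f, n=80):
    # Approximate E_{z~N(0,1)}[f(z)] with n-point Gauss-Hermite quadrature.
    # Substituting z = sqrt(2)*x matches the Hermite weight e^{-x^2}.
    x, w = np.polynomial.hermite.hermgauss(n)
    return float(np.sum(w * f(np.sqrt(2.0) * x)) / np.sqrt(np.pi))

relu = lambda z: np.maximum(z, 0.0)
drelu = lambda z: (z > 0).astype(float)

# Assumed moment definitions (see lead-in); analytic values for ReLU:
m1 = gh_gaussian_expect(relu)                   # ~ 1/sqrt(2*pi) ~ 0.3989
m2 = gh_gaussian_expect(lambda z: relu(z)**2)   # = 1/2
g1 = gh_gaussian_expect(drelu)                  # = 1/2
g2 = gh_gaussian_expect(lambda z: drelu(z)**2)  # = 1/2
```

Note that for ReLU the quadrature recovers m2, g1, and g2 essentially exactly (the odd parts cancel over symmetric nodes), while m1 carries a small quadrature error from the kink at zero, which is one reason the abstract pairs Gauss-Hermite with Monte Carlo cross-checks.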
Oct-10-2025