Symmetry in Neural Network Parameter Spaces
Zhao, Bo, Walters, Robin, Yu, Rose
arXiv.org Artificial Intelligence
Modern deep learning models are highly overparameterized, resulting in large sets of parameter configurations that yield the same outputs. A significant portion of this redundancy is explained by symmetries in the parameter space: transformations that leave the network function unchanged. These symmetries shape the loss landscape and constrain learning dynamics, offering a new lens for understanding optimization, generalization, and model complexity that complements existing deep learning theory. This survey provides an overview of parameter space symmetry. We summarize the existing literature, uncover connections between symmetry and learning theory, and identify gaps and opportunities in this emerging field.
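The simplest example of such a symmetry is hidden-neuron permutation: reordering the hidden units of a feedforward network, together with the corresponding rows and columns of its weight matrices, yields a different point in parameter space that computes exactly the same function. A minimal sketch (shapes and names are illustrative, not from the survey):

```python
import numpy as np

# Permutation symmetry in a two-layer ReLU network: permuting the hidden
# units (rows of W1/b1 and matching columns of W2) leaves the computed
# function unchanged, so both parameter settings give identical outputs.

rng = np.random.default_rng(0)
W1, b1 = rng.normal(size=(4, 3)), rng.normal(size=4)   # input -> hidden
W2, b2 = rng.normal(size=(2, 4)), rng.normal(size=2)   # hidden -> output

def mlp(x, W1, b1, W2, b2):
    h = np.maximum(W1 @ x + b1, 0.0)   # ReLU hidden layer
    return W2 @ h + b2

x = rng.normal(size=3)
perm = rng.permutation(4)              # reorder the 4 hidden neurons

out_orig = mlp(x, W1, b1, W2, b2)
out_perm = mlp(x, W1[perm], b1[perm], W2[:, perm], b2)

assert np.allclose(out_orig, out_perm)  # distinct parameters, same function
```

Beyond permutations, ReLU networks also admit continuous rescaling symmetries (scaling a hidden unit's incoming weights by c > 0 and its outgoing weights by 1/c), which is part of why the set of function-equivalent parameters can form whole manifolds rather than isolated points.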
Dec-12-2025