Parameter Symmetry and Noise Equilibrium of Stochastic Gradient Descent
Liu Ziyin (Massachusetts Institute of Technology, NTT Research)
–Neural Information Processing Systems
Symmetries are prevalent in deep learning and can significantly influence the learning dynamics of neural networks. In this paper, we examine how exponential symmetries, a broad subclass of continuous symmetries present in the model architecture or loss function, interact with stochastic gradient descent (SGD). We first prove that gradient noise creates a systematic motion (a "Noether flow") of the parameters θ along the degenerate direction to a unique initialization-independent fixed point θ
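The phenomenon the abstract describes can be illustrated with a toy model not taken from the paper: a two-layer scalar network f(x) = u·w·x, which has the rescale (exponential) symmetry u → c·u, w → w/c. Under exact gradient flow the quantity u² − w² along the degenerate direction is conserved, but the finite-step noise of SGD drives it toward the balanced fixed point u² = w², independently of initialization. The sketch below, with assumed hyperparameters (learning rate, label-noise scale, step count), simulates this drift:

```python
# Minimal illustrative sketch (not the paper's experiment): rescale symmetry
# u -> c*u, w -> w/c leaves f(x) = u*w*x invariant, so u^2 - w^2 parametrizes
# the degenerate direction. Per-sample SGD satisfies exactly
#   B_{t+1} = B_t * (1 - lr^2 * r^2 * x^2),   B = u^2 - w^2,
# so gradient noise contracts B toward the balanced point B = 0.
import numpy as np

rng = np.random.default_rng(0)
u, w = 2.0, 0.5                 # unbalanced start on the minimum manifold (u*w = 1)
lr, sigma, steps = 0.02, 0.5, 100_000
init_balance = u**2 - w**2       # conserved under gradient flow, not under SGD

for _ in range(steps):
    x = rng.normal()
    y = x + sigma * rng.normal()   # noisy labels keep gradient noise alive at the minimum
    r = u * w * x - y              # residual
    gu, gw = r * w * x, r * u * x  # per-sample gradients of 0.5 * r^2
    u, w = u - lr * gu, w - lr * gw

balance = u**2 - w**2
print(f"initial u^2 - w^2 = {init_balance:.3f}, final = {balance:.4f}")
```

The fixed point here (u² = w²) is reached regardless of where on the minimum manifold training starts, matching the abstract's claim of an initialization-independent equilibrium selected by the noise.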