Gradient Flossing: Improving Gradient Descent through Dynamic Control of Jacobians
Neural Information Processing Systems (NeurIPS)
Training recurrent neural networks (RNNs) remains challenging because gradients are unstable across long time horizons, which can lead to exploding or vanishing gradients. Recent research has linked these problems to the Lyapunov exponents of the forward dynamics, which describe the growth or shrinkage of infinitesimal perturbations. Here, we propose gradient flossing, a novel approach to tackling gradient instability by pushing the Lyapunov exponents of the forward dynamics toward zero during learning.
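The abstract gives enough to sketch the core mechanism: estimate finite-time Lyapunov exponents of the RNN's hidden-state dynamics by propagating an orthonormal tangent basis through the step Jacobians with repeated QR re-orthonormalization, then add a differentiable penalty that drives those exponents toward zero. The sketch below is a minimal PyTorch illustration under those assumptions; the function name `lyapunov_penalty`, the choice of `k` leading exponents, and the squared-exponent penalty form are illustrative, not the paper's exact implementation.

```python
import torch

def lyapunov_penalty(step_fn, h0, xs, k=4, eps=1e-12):
    """Gradient-flossing penalty (sketch): finite-time Lyapunov exponents
    of the hidden-state dynamics, penalized by their sum of squares so
    that training pushes them toward zero.

    step_fn(h, x) -> next hidden state (the RNN transition)
    h0: initial hidden state, shape (n,)
    xs: input sequence, shape (T, input_dim)
    k:  number of leading exponents to regularize
    """
    n = h0.shape[0]
    Q = torch.eye(n, k, dtype=h0.dtype)          # orthonormal tangent basis
    log_stretch = torch.zeros(k, dtype=h0.dtype) # accumulated log growth rates
    h = h0
    T = xs.shape[0]
    for t in range(T):
        x = xs[t]
        # Jacobian of the transition w.r.t. the hidden state, kept in the
        # autograd graph so the penalty itself is differentiable.
        J = torch.autograd.functional.jacobian(
            lambda h_: step_fn(h_, x), h, create_graph=True)
        h = step_fn(h, x)
        # Propagate the tangent basis through the Jacobian and
        # re-orthonormalize via QR; the diagonal of R holds the
        # per-step stretching factors along each tangent direction.
        Q, R = torch.linalg.qr(J @ Q)
        log_stretch = log_stretch + torch.log(R.diagonal().abs() + eps)
    lyap = log_stretch / T       # finite-time Lyapunov exponents
    return (lyap ** 2).sum()     # flossing penalty: drive exponents to 0
```

A hypothetical usage with a vanilla tanh RNN cell, where the penalty weight 0.1 is an arbitrary illustrative choice:

```python
W = torch.randn(32, 32, requires_grad=True)
U = torch.randn(32, 8, requires_grad=True)
step = lambda h, x: torch.tanh(W @ h + U @ x)
penalty = lyapunov_penalty(step, torch.zeros(32), torch.randn(50, 8), k=4)
# loss = task_loss + 0.1 * penalty, then loss.backward() as usual
```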