Backprop with Approximate Activations for Memory-efficient Network Training
Ayan Chakrabarti, Benjamin Moseley
Neural Information Processing Systems
Training convolutional neural network models is memory intensive since back-propagation requires storing activations of all intermediate layers.
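To make the memory cost concrete, here is a minimal sketch (not the paper's method, and with hypothetical layer shapes chosen only for illustration) that estimates how many bytes back-propagation must hold for the stored activations of a few convolutional layers, and the reduction from keeping low-precision approximate copies instead of exact 32-bit ones:

```python
# Illustrative sketch: estimate activation-storage memory for backprop.
# The shapes below are hypothetical, not taken from the paper.

def activation_bytes(shapes, bytes_per_element):
    """Total bytes needed to store one activation tensor per layer."""
    total = 0
    for shape in shapes:
        n = 1
        for d in shape:
            n *= d  # number of elements in this activation tensor
        total += n * bytes_per_element
    return total

# Hypothetical activation shapes (batch, channels, height, width).
shapes = [
    (32, 64, 112, 112),
    (32, 128, 56, 56),
    (32, 256, 28, 28),
    (32, 512, 14, 14),
]

fp32 = activation_bytes(shapes, 4)  # exact 32-bit activations
int8 = activation_bytes(shapes, 1)  # approximate 8-bit copies

print(f"fp32 activations: {fp32 / 2**20:.1f} MiB")
print(f"int8 activations: {int8 / 2**20:.1f} MiB")
```

Storing 8-bit approximations in place of 32-bit activations cuts this buffer memory by 4x, which is the kind of saving approximate-activation training targets.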