Scaling and Resizing Symmetry in Feedforward Networks
Weight initialization in deep neural networks has a strong impact on the convergence speed of the learning map. Recent studies have shown that, for random initializations, an order/chaos phase transition occurs in the space of variances of the random weights and biases. Experiments have then shown that large improvements in training speed can be obtained if a neural network is initialized on the critical line of this phase transition. In this contribution, we show evidence that the scaling property exhibited by physical systems at criticality is also present in untrained feedforward networks with random weights initialized on the critical line. Additionally, we suggest a data-resizing symmetry that is directly inherited from the scaling symmetry at criticality.
arXiv.org Artificial Intelligence
Jun-26-2023
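The order/chaos transition the abstract refers to can be probed numerically: in the mean-field picture of deep random networks, two nearby inputs pushed through a deep random tanh network converge in the ordered phase and decorrelate in the chaotic phase. Below is a minimal sketch, assuming a fully connected tanh network with i.i.d. Gaussian weights of standard deviation sigma_w/sqrt(width) and biases of standard deviation sigma_b (the parametrization common in this literature); the function name and parameter choices are illustrative, not taken from the paper.

```python
import numpy as np

def output_similarity(sigma_w, sigma_b, depth=100, width=500, seed=0):
    """Propagate two nearby inputs through a deep random tanh network
    and return the cosine similarity of the final-layer activations."""
    rng = np.random.default_rng(seed)
    h1 = rng.standard_normal(width)
    h2 = h1 + 1e-3 * rng.standard_normal(width)  # small input perturbation
    for _ in range(depth):
        # i.i.d. Gaussian weights with variance sigma_w^2 / width
        W = rng.standard_normal((width, width)) * sigma_w / np.sqrt(width)
        b = rng.standard_normal(width) * sigma_b
        h1 = np.tanh(W @ h1 + b)
        h2 = np.tanh(W @ h2 + b)
    return float(h1 @ h2 / (np.linalg.norm(h1) * np.linalg.norm(h2)))
```

In the ordered phase (e.g. sigma_w = 0.5) the perturbation dies out and the similarity stays close to 1; deep in the chaotic phase (e.g. sigma_w = 3.0) the perturbation is amplified and the outputs decorrelate. For tanh networks with small bias variance, the critical line separating the two phases passes near sigma_w = 1.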