Progressive Depth Up-scaling via Optimal Transport
Mingzi Cao, Xi Wang, Nikolaos Aletras
–arXiv.org Artificial Intelligence
Depth up-scaling improves training efficiency by adding new layers to pre-trained models. However, most existing methods copy or average weights from base layers, neglecting neuron permutation differences between layers; this can cause misalignment that harms performance. Inspired by the use of Optimal Transport (OT) for neuron alignment, we propose Optimal Transport Depth Up-Scaling (OpT-DeUS). OpT-DeUS aligns and fuses Transformer blocks in adjacent base layers via OT to create new layers, mitigating neuron permutation mismatch between layers. OpT-DeUS achieves better overall performance and higher training efficiency than existing methods for continual pre-training and supervised fine-tuning across different model sizes. Our extensive analysis of interpolation positions further shows that inserting new layers closer to the top yields higher training efficiency, due to shorter back-propagation paths, while providing additional performance gains.
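The core idea — aligning the neurons of two adjacent Transformer layers before averaging them into a new layer — can be sketched with a hard (one-to-one) transport plan, which is a special case of OT solvable by the Hungarian algorithm. This is a minimal illustration, not the paper's implementation: the function name `align_and_fuse` and the use of squared-Euclidean cost over weight rows are assumptions for the sketch.

```python
import numpy as np
from scipy.optimize import linear_sum_assignment
from scipy.spatial.distance import cdist

def align_and_fuse(W_lower: np.ndarray, W_upper: np.ndarray) -> np.ndarray:
    """Align neurons (rows) of W_upper to those of W_lower via a hard
    OT plan, then average the aligned weights to form a new layer.

    Hypothetical sketch: real depth up-scaling would apply the same
    permutation consistently across all weight matrices of a block.
    """
    # Pairwise cost between neurons of the two layers.
    cost = cdist(W_lower, W_upper, metric="sqeuclidean")
    # Minimum-cost one-to-one matching (hard OT / Hungarian algorithm).
    _, col = linear_sum_assignment(cost)
    # Permute the upper layer's neurons into alignment, then fuse.
    W_upper_aligned = W_upper[col]
    return 0.5 * (W_lower + W_upper_aligned)
```

If `W_upper` is an exact row permutation of `W_lower`, the matching recovers that permutation and the fused layer equals `W_lower` — whereas naive averaging without alignment would blur unrelated neurons together.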
Aug-12-2025