Text2Weight: Bridging Natural Language and Neural Network Weight Spaces
Bowen Tian, Wenshuo Chen, Zexi Li, Songning Lai, Jiemin Wu, Yutao Yue
arXiv.org Artificial Intelligence
How far are we really from automatically generating neural networks? While neural network weight generation shows promise, current approaches struggle to generalize to unseen tasks and remain underexplored in practical applications. To address this, we propose T2W, a diffusion transformer framework that generates task-specific weights conditioned on natural language descriptions. T2W hierarchically processes network parameters into uniform blocks, integrates text embeddings from CLIP via a prior attention mechanism, and employs adversarial training with weight-space augmentation to improve generalization. Experiments on CIFAR-100, Caltech-256, and Tiny-ImageNet demonstrate T2W's ability to produce high-quality weights for unseen tasks, outperforming optimization-based initialization and enabling novel applications such as weight enhancement and text-guided model fusion. Our work bridges textual semantics with weight-space dynamics, supported by an open-source dataset of text-weight pairs, advancing the practicality of generative models in neural network parameter synthesis. Our code is available on GitHub.
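The abstract's tokenization step, flattening heterogeneous layer parameters into uniform blocks that a transformer can attend over, can be sketched as below. This is a minimal illustration, not the paper's implementation: the block size, zero-padding scheme, and function names are assumptions for the sake of the example.

```python
def weights_to_blocks(layer_weights, block_size=64):
    # Flatten per-layer parameter lists into one vector, zero-pad it to a
    # multiple of block_size, then split into uniform fixed-size blocks.
    # (Illustrative sketch -- block size and padding are assumptions.)
    flat = [p for layer in layer_weights for p in layer]
    pad = (-len(flat)) % block_size
    flat_padded = flat + [0.0] * pad
    blocks = [flat_padded[i:i + block_size]
              for i in range(0, len(flat_padded), block_size)]
    return blocks, len(flat)

def blocks_to_weights(blocks, total_len, layer_sizes):
    # Invert the blocking: concatenate blocks, drop the padding,
    # and re-slice the flat vector back into per-layer parameter lists.
    flat = [p for block in blocks for p in block][:total_len]
    out, i = [], 0
    for n in layer_sizes:
        out.append(flat[i:i + n])
        i += n
    return out

# Round-trip check on toy "network" parameters.
layers = [[1.0] * 7, [2.0] * 5]
blocks, total = weights_to_blocks(layers, block_size=4)
restored = blocks_to_weights(blocks, total, [7, 5])
assert restored == layers
```

Uniform blocks make every network, regardless of its layer shapes, look like a fixed-width token sequence, which is what lets a single diffusion transformer operate across architectures and tasks.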
Aug-20-2025