ManufactuBERT: Efficient Continual Pretraining for Manufacturing
Robin Armingaud, Romaric Besançon
While large Transformer-based encoders excel at general language understanding, their performance diminishes in specialized domains like manufacturing due to a lack of exposure to domain-specific terminology and semantics. In this paper, we address this gap by introducing ManufactuBERT, a RoBERTa model continually pretrained on a large-scale corpus curated for the manufacturing domain. We present a comprehensive data processing pipeline to create this corpus from web data, involving an initial domain-specific filtering step followed by a multi-stage deduplication process that removes redundancies. Our experiments show that ManufactuBERT establishes a new state-of-the-art on a range of manufacturing-related NLP tasks, outperforming strong specialized baselines. More importantly, we demonstrate that training on our carefully deduplicated corpus significantly accelerates convergence, leading to a 33% reduction in training time and computational cost compared to training on the non-deduplicated dataset. The proposed pipeline offers a reproducible example for developing high-performing encoders in other specialized domains. We will release our model and curated corpus at https://huggingface.co/cea-list-ia.
arXiv.org Artificial Intelligence
Nov-10-2025
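
The abstract describes the corpus pipeline only at a high level: a domain-specific filter followed by a multi-stage deduplication pass. The paper's actual stages and settings are not given here, so the Python sketch below is a hypothetical illustration of one common two-stage design, not the authors' method: stage one drops exact duplicates via content hashing, and stage two drops near-duplicates with MinHash LSH (via the datasketch library). The shingle size, number of permutations, and Jaccard threshold are assumed values.

```python
# Hypothetical two-stage deduplication sketch (assumed design, not the
# paper's exact pipeline): exact dedup by SHA-256, then near-dup removal
# with MinHash LSH over word shingles.
import hashlib
from datasketch import MinHash, MinHashLSH

NUM_PERM = 128           # number of MinHash permutations (assumed)
JACCARD_THRESHOLD = 0.8  # near-duplicate similarity cutoff (assumed)

def shingles(text: str, n: int = 5):
    """Yield word n-grams used as the MinHash token set."""
    words = text.split()
    for i in range(max(len(words) - n + 1, 1)):
        yield " ".join(words[i:i + n])

def minhash(text: str) -> MinHash:
    """Build a MinHash signature from the document's shingles."""
    m = MinHash(num_perm=NUM_PERM)
    for sh in shingles(text):
        m.update(sh.encode("utf-8"))
    return m

def deduplicate(docs):
    """Return docs with exact and near duplicates removed."""
    seen_hashes = set()  # stage 1: exact-duplicate filter
    lsh = MinHashLSH(threshold=JACCARD_THRESHOLD, num_perm=NUM_PERM)
    kept = []
    for idx, doc in enumerate(docs):
        digest = hashlib.sha256(doc.encode("utf-8")).hexdigest()
        if digest in seen_hashes:
            continue
        seen_hashes.add(digest)
        m = minhash(doc)  # stage 2: near-duplicate filter
        if lsh.query(m):  # any earlier doc above the threshold?
            continue
        lsh.insert(str(idx), m)
        kept.append(doc)
    return kept

if __name__ == "__main__":
    corpus = [
        "CNC machining removes material from a workpiece with cutting tools.",
        "CNC machining removes material from a workpiece with cutting tools.",   # exact dup
        "CNC machining removes material from the workpiece using cutting tools.",  # near dup
        "Injection molding forces molten polymer into a shaped cavity.",
    ]
    print(len(deduplicate(corpus)), "documents kept")
```

At web scale, the same idea is usually run in a distributed setting with banded LSH over document shards, but the filtering logic per document is the same as in this sketch.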