Modular MeanFlow: Towards Stable and Scalable One-Step Generative Modeling
Haochen You, Baojing Liu, Hongyang He
arXiv.org Artificial Intelligence
One-step generative modeling seeks to generate high-quality data samples in a single function evaluation, significantly improving efficiency over traditional diffusion or flow-based models. In this work, we introduce Modular MeanFlow (MMF), a flexible and theoretically grounded approach for learning time-averaged velocity fields. Our method derives a family of loss functions based on a differential identity linking instantaneous and average velocities, and incorporates a gradient modulation mechanism that enables stable training without sacrificing expressiveness. We further propose a curriculum-style warmup schedule to smoothly transition from coarse supervision to fully differentiable training. The MMF formulation unifies and generalizes existing consistency-based and flow-matching methods, while avoiding expensive higher-order derivatives. Empirical results across image synthesis and trajectory modeling tasks demonstrate that MMF achieves competitive sample quality, robust convergence, and strong generalization, particularly under low-data or out-of-distribution settings.
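The "differential identity linking instantaneous and average velocities" is presumably the MeanFlow relation between the average velocity $u$ over an interval $[r, t]$ and the instantaneous velocity $v$; the notation below is assumed rather than taken from the paper:

```latex
u(z_t, r, t) = \frac{1}{t - r} \int_r^t v(z_\tau, \tau)\, \mathrm{d}\tau
\quad\Longrightarrow\quad
u(z_t, r, t) = v(z_t, t) - (t - r)\,\frac{\mathrm{d}}{\mathrm{d}t}\, u(z_t, r, t),
```

where $\mathrm{d}/\mathrm{d}t$ is the total derivative along the trajectory. The right-hand side yields a regression target for $u$ that requires only a single Jacobian-vector product, not higher-order derivatives.

Below is a minimal sketch of how a gradient-modulated loss of this kind might look, under one plausible reading of the abstract: the modulation coefficient interpolates between a stop-gradient target (coarse supervision) and a fully differentiable one, and a warmup schedule raises it over training. The names `mmf_loss`, `v_inst`, `lam`, and all shapes are illustrative assumptions, not the authors' implementation.

```python
import torch


def mmf_loss(model, v_inst, z_t, r, t, lam):
    """Sketch of a gradient-modulated MeanFlow-style loss (illustrative only).

    model(z, r, t) -> predicted average velocity u_theta, shape (B, D)
    v_inst         -> instantaneous velocity target (e.g., from flow matching), shape (B, D)
    z_t            -> noisy sample at time t, shape (B, D)
    r, t           -> interval endpoints, shape (B, 1)
    lam in [0, 1]  -> modulation coefficient; 0 = fully detached target,
                      1 = fully differentiable training (warmed up over training).
    """
    u = model(z_t, r, t)

    # Total derivative du/dt along the trajectory via a single JVP;
    # the tangent is (dz/dt, dr/dt, dt/dt) = (v_inst, 0, 1).
    _, dudt = torch.func.jvp(
        lambda z, r_, t_: model(z, r_, t_),
        (z_t, r, t),
        (v_inst, torch.zeros_like(r), torch.ones_like(t)),
    )

    # Target from the identity u = v - (t - r) * du/dt.
    target = v_inst - (t - r) * dudt

    # Gradient modulation: forward value unchanged, gradient through the
    # target scaled by lam (assumed mechanism, see lead-in above).
    target = lam * target + (1.0 - lam) * target.detach()

    return torch.mean((u - target) ** 2)


# Example usage (shapes are illustrative):
# B, D = 64, 2
# z_t = torch.randn(B, D); r = torch.rand(B, 1); t = r + torch.rand(B, 1) * (1 - r)
# loss = mmf_loss(model, v_inst, z_t, r, t, lam=0.5)
```

Under this reading, the curriculum warmup simply schedules `lam` from 0 toward 1, moving from coarse, stop-gradient supervision to fully differentiable training as described in the abstract.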
Aug-26-2025
- Country:
- Asia > China
- Hebei Province > Shijiazhuang (0.04)
- Europe > United Kingdom
- England > West Midlands > Coventry (0.04)
- North America > United States
- New York (0.04)
- South America > Colombia (0.04)
- Genre:
- Research Report > New Finding (0.46)
- Technology:
- Information Technology > Artificial Intelligence
- Machine Learning > Neural Networks (0.68)
- Representation & Reasoning (1.00)
- Vision (0.89)