Arcadia: Toward a Full-Lifecycle Framework for Embodied Lifelong Learning

Gao, Minghe, Li, Juncheng, Lin, Yuze, Liu, Xuqi, Ji, Jiaming, Pan, Xiaoran, Xu, Zihan, Li, Xian, Li, Mingjie, Ji, Wei, Wei, Rong, Tang, Rui, Wang, Qizhou, Shen, Kai, Xiao, Jun, Wu, Qi, Tang, Siliang, Zhuang, Yueting

arXiv.org Artificial Intelligence 

We contend that embodied learning is fundamentally a life-cycle problem rather than a single-stage optimization. Systems that optimize only one link (data collection, simulation, learning, or deployment) rarely sustain improvement or generalize beyond narrow settings. We introduce Arcadia, a closed-loop framework that operationalizes embodied lifelong learning by tightly coupling four stages: (1) Self-evolving exploration and grounding for autonomous data acquisition in physical environments, (2) Generative scene reconstruction and augmentation for realistic and extensible scene creation, (3) a Shared embodied representation architecture that unifies navigation and manipulation within a single multimodal backbone, and (4) Sim-from-real evaluation and evolution that closes the feedback loop through simulation-based adaptation. This coupling is non-decomposable: removing any stage breaks the improvement loop and reverts to one-shot training. Arcadia delivers consistent gains on navigation and manipulation benchmarks and transfers robustly to physical robots, indicating that a tightly coupled lifecycle of continuous real-world data acquisition, generative simulation update, and shared-representation learning supports lifelong improvement and end-to-end generalization. We release standardized interfaces enabling reproducible evaluation and cross-model comparison in reusable environments, positioning Arcadia as a scalable foundation for general-purpose embodied agents.