Prompt a Robot to Walk with Large Language Models
Yen-Jen Wang, Bike Zhang, Jianyu Chen, Koushil Sreenath
arXiv.org Artificial Intelligence
Large language models (LLMs) pre-trained on vast internet-scale data have showcased remarkable capabilities across diverse domains. Recently, there has been escalating interest in deploying LLMs for robotics, aiming to harness the power of foundation models in real-world settings. However, this approach faces significant challenges, particularly in grounding these models in the physical world and in generating dynamic robot motions. To address these issues, we introduce a novel paradigm in which we use few-shot prompts collected from the physical environment, enabling the LLM to autoregressively generate low-level control commands for robots without task-specific fine-tuning. Experiments across various robots and environments validate that our method can effectively prompt a robot to walk. We thus illustrate how LLMs can proficiently function as low-level feedback controllers for dynamic motion control, even in high-dimensional robotic systems. The project website and source code can be found at: https://prompt2walk.github.io/.
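The control scheme the abstract describes (observations serialized into a few-shot text prompt, the LLM autoregressively completing the next low-level action, which is applied and fed back) can be sketched as a simple closed loop. The sketch below is illustrative only; all names (`format_prompt`, `parse_action`, `control_loop`) and the text format are assumptions, not the authors' implementation.

```python
from typing import Callable, List, Tuple

# Hypothetical serialization of the rolling (observation, action) history
# into a few-shot prompt, ending with the current observation so the LLM
# completes the next action line.
def format_prompt(history: List[Tuple[List[float], List[float]]],
                  obs: List[float]) -> str:
    lines = ["Given observations, output the next joint-position targets."]
    for past_obs, past_act in history:
        lines.append("obs: " + " ".join(f"{x:.3f}" for x in past_obs))
        lines.append("act: " + " ".join(f"{x:.3f}" for x in past_act))
    lines.append("obs: " + " ".join(f"{x:.3f}" for x in obs))
    lines.append("act:")
    return "\n".join(lines)

# Read a whitespace-separated action vector back out of the LLM's reply.
def parse_action(reply: str, dim: int) -> List[float]:
    return [float(tok) for tok in reply.split()[:dim]]

# Closed feedback loop: prompt -> LLM action -> apply to robot -> new
# observation. `llm` and `step_env` are stand-ins for the model query and
# the robot/simulator step, respectively.
def control_loop(llm: Callable[[str], str],
                 step_env: Callable[[List[float]], List[float]],
                 obs: List[float], dim: int,
                 steps: int = 5, window: int = 8):
    history: List[Tuple[List[float], List[float]]] = []
    for _ in range(steps):
        prompt = format_prompt(history[-window:], obs)
        action = parse_action(llm(prompt), dim)
        obs = step_env(action)          # robot executes the command
        history.append((obs, action))   # extend the few-shot context
    return history
```

Keeping only the most recent `window` exchanges in the prompt mirrors the idea of grounding the model with examples collected from the physical environment while staying within the LLM's context limit.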
Nov-16-2023