Collaborating Authors: Büsching, Marcel


Cloth-Splatting: 3D Cloth State Estimation from RGB Supervision

arXiv.org Artificial Intelligence

Teaching robots to fold, drape, or manipulate deformable objects such as cloth is fundamental to unlocking a variety of applications ranging from healthcare to domestic and industrial environments [1]. While considerable progress has been made in rigid-object manipulation, manipulating deformables poses unique challenges, including infinite-dimensional state spaces, complex physical dynamics, and state estimation of self-occluded configurations [2]. Specifically, the difficulty of state estimation has led existing work on visual manipulation either to rely exclusively on 2D images, overlooking the cloth's 3D structure [3, 4, 5], or to use 3D representations that neglect valuable information in RGB observations [6, 7, 8]. Prior work on cloth state estimation often relies on 3D particle-based representations derived from depth sensors, including graphs [9, 10] and point clouds [11]. While point clouds effectively capture the object's observable state, they lack comprehensive structural information [6].
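To make the contrast between the two representations concrete, the following minimal sketch (not from the paper; all names and the regular-grid assumption are illustrative) builds both for the same set of cloth particles: a point cloud stores only positions, while a mesh graph additionally stores neighbor edges, which carry the structural information a raw point cloud lacks.

```python
import numpy as np

# Hypothetical regular W x H grid of cloth particles; in practice these
# positions would be estimated from depth or RGB observations.
W, H = 4, 3
xs, ys = np.meshgrid(np.arange(W), np.arange(H), indexing="ij")
positions = np.stack(
    [xs, ys, np.zeros_like(xs)], axis=-1
).reshape(-1, 3).astype(float)

# Point-cloud representation: positions only, no connectivity.
point_cloud = positions

def grid_edges(w, h):
    """Edges between horizontally/vertically adjacent grid particles."""
    idx = lambda i, j: i * h + j
    edges = []
    for i in range(w):
        for j in range(h):
            if i + 1 < w:
                edges.append((idx(i, j), idx(i + 1, j)))
            if j + 1 < h:
                edges.append((idx(i, j), idx(i, j + 1)))
    return edges

# Graph representation: same positions plus mesh connectivity.
edges = grid_edges(W, H)

print(point_cloud.shape)  # (12, 3)
print(len(edges))         # 17 neighbor edges for a 4 x 3 grid
```

The edge list is what distinguishes a mesh graph from a bare point cloud: it fixes which particles are physically connected, so downstream dynamics or tracking models can reason about stretching and bending between neighbors rather than treating the points as an unordered set.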