wyffels, Francis
Evaluating Text-to-Image Diffusion Models for Texturing Synthetic Data
Lips, Thomas, wyffels, Francis
Building generic robotic manipulation systems often requires large amounts of real-world data, which can be difficult to collect. Synthetic data generation offers a promising alternative, but limiting the sim-to-real gap requires significant engineering efforts. To reduce this engineering effort, we investigate the use of pretrained text-to-image diffusion models for texturing synthetic images and compare this approach with using random textures, a common domain randomization technique in synthetic data generation. We focus on generating object-centric representations, such as keypoints and segmentation masks, which are important for robotic manipulation and require precise annotations. We evaluate the efficacy of the texturing methods by training models on the synthetic data and measuring their performance on real-world datasets for three object categories: shoes, T-shirts, and mugs. Surprisingly, we find that texturing using a diffusion model performs on par with random textures, despite generating seemingly more realistic images. Our results suggest that, for now, using diffusion models for texturing does not benefit synthetic data generation for robotics. The code, data, and trained models are available at \url{https://github.com/tlpss/diffusing-synthetic-data.git}.
Automatic Calibration for an Open-source Magnetic Tactile Sensor
Stockt, Lowiek Van den, Proesmans, Remko, wyffels, Francis
Tactile sensing can enable robots to perform complex, contact-rich tasks. Magnetic sensors offer accurate three-axis force measurements while using affordable materials. Calibrating such a sensor involves either manual data collection, or automated procedures with precise mounting of the sensor relative to an actuator. We present an open-source magnetic tactile sensor with an automatic, in situ, gripper-agnostic calibration method, after which the sensor is immediately ready for use. Our goal is to lower the barrier to entry for tactile sensing, fostering collaboration in robotics. Design files and readout code can be found at https://github.com/LowiekVDS/Open-source-Magnetic-Tactile-Sensor}{https://github.com/LowiekVDS/Open-source-Magnetic-Tactile-Sensor.
KeyCLD: Learning Constrained Lagrangian Dynamics in Keypoint Coordinates from Images
Daems, Rembert, Taets, Jeroen, wyffels, Francis, Crevecoeur, Guillaume
We present KeyCLD, a framework to learn Lagrangian dynamics from images. Learned keypoints represent semantic landmarks in images and can directly represent state dynamics. We show that interpreting this state as Cartesian coordinates, coupled with explicit holonomic constraints, allows expressing the dynamics with a constrained Lagrangian. KeyCLD is trained unsupervised end-to-end on sequences of images. Our method explicitly models the mass matrix, potential energy and the input matrix, thus allowing energy-based control. We demonstrate learning of Lagrangian dynamics from images on the dm_control pendulum, cartpole and acrobot environments. KeyCLD can be learned on these systems, whether they are unactuated, underactuated or fully actuated. Trained models are able to produce long-term video predictions, showing that the dynamics are accurately learned. We compare with Lag-VAE, Lag-caVAE and HGN, and investigate the benefit of the Lagrangian prior and the constraint function. KeyCLD achieves the highest valid prediction time on all benchmarks. Additionally, a very straightforward energy-shaping controller is successfully applied on the fully actuated systems. Please refer to our project page for code and additional results: https://rdaems.github.io/keycld/
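For reference, the standard constrained Euler-Lagrange formulation that such a framework builds on can be sketched as follows (generic notation, not necessarily the paper's):

```latex
\begin{aligned}
L(q, \dot q) &= \tfrac{1}{2}\,\dot q^\top M(q)\,\dot q - V(q),
\qquad \phi(q) = 0,\\
\frac{d}{dt}\frac{\partial L}{\partial \dot q}
 - \frac{\partial L}{\partial q}
&= g(q)\,u + \Big(\frac{\partial \phi}{\partial q}\Big)^{\!\top} \lambda,
\end{aligned}
```

where $q$ are the Cartesian keypoint coordinates, $M(q)$ the mass matrix, $V(q)$ the potential energy, $g(q)$ the input matrix, $u$ the control input, $\phi(q) = 0$ the holonomic constraints, and $\lambda$ the Lagrange multipliers enforcing them.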
Seamless Integration of Tactile Sensors for Cobots
Proesmans, Remko, wyffels, Francis
The development of tactile sensing is expected to enhance robotic systems in handling complex objects like deformables or reflective materials. However, readily available industrial grippers generally lack tactile feedback, which has led researchers to develop their own tactile sensors, resulting in a wide range of sensor hardware. Reading data from these sensors poses an integration challenge: either external wires must be routed along the robotic arm, or a wireless processing unit has to be fixed to the robot, increasing its size. We have developed a microcontroller-based sensor readout solution that seamlessly integrates with Robotiq grippers. Our Arduino-compatible design takes away a major part of the integration complexity of tactile sensors and can serve as a valuable accelerator of research in the field.
Augmenting Off-the-Shelf Grippers with Tactile Sensing
Proesmans, Remko, wyffels, Francis
The development of tactile sensing and its fusion with computer vision is expected to enhance robotic systems in handling complex tasks like deformable object manipulation. However, readily available industrial grippers typically lack tactile feedback, which has led researchers to develop and integrate their own tactile sensors. This has resulted in a wide range of sensor hardware, making it difficult to compare performance between different systems. We highlight the value of accessible open-source sensors and present a set of fingertips specifically designed for fine object manipulation, with readily interpretable data outputs. The fingertips are validated through two difficult tasks: cloth edge tracing and cable tracing. Videos of these demonstrations, as well as design files and readout code, can be found at https://github.com/RemkoPr/icra-2023-workshop-tactile-fingertips.
Revisiting Proprioceptive Sensing for Articulated Object Manipulation
Lips, Thomas, wyffels, Francis
Robots that assist humans will need to interact with articulated objects such as cabinets or microwaves. Early work on creating systems for doing so used proprioceptive sensing to estimate joint mechanisms during contact. However, nowadays, almost all systems use only vision and no longer consider proprioceptive information during contact. We believe proprioceptive information during contact is a valuable source of information, and we found no clear motivation in the literature for abandoning it. Therefore, in this paper, we create a system that, starting from a given grasp, uses proprioceptive sensing to open cabinets with a position-controlled robot and a parallel gripper. We perform a qualitative evaluation of this system, where we find that slip between the gripper and handle limits the performance. Nonetheless, we find that the system already performs quite well. This poses the question: should we make more use of proprioceptive information during contact in articulated object manipulation systems, or is it not worth the added complexity, and can we manage with vision alone? We do not have an answer to this question, but we hope to spark some discussion on the matter. The codebase and videos of the system are available at https://tlpss.github.io/revisiting-proprioception-for-articulated-manipulation/.
A Differentiable Physics Engine for Deep Learning in Robotics
Degrave, Jonas, Hermans, Michiel, Dambre, Joni, wyffels, Francis
An important field in robotics is the optimization of controllers. Currently, robots are often treated as a black box in this optimization process, which is the reason why derivative-free optimization methods such as evolutionary algorithms or reinforcement learning are omnipresent. When gradient-based methods are used, models are kept small or rely on finite difference approximations for the Jacobian, an approach that quickly grows expensive as the number of parameters increases, such as in deep learning. We propose the implementation of a modern physics engine that is differentiable with respect to its control parameters. This engine is implemented for both CPU and GPU. Firstly, this paper shows how such an engine speeds up the optimization process, even for small problems. Furthermore, it explains why this is an alternative approach to deep Q-learning for using deep learning in robotics. Finally, we argue that this is a big step for deep learning in robotics, as it opens up new possibilities to optimize robots, both in hardware and software.
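The core idea of such an engine, propagating gradients through the simulation steps themselves, can be illustrated with a toy example. The sketch below is not the paper's engine: it simulates a point mass with a proportional controller (gain `k`, a made-up parameter for illustration) using semi-implicit Euler steps, and recovers the exact gradient of a terminal loss with respect to `k` by hand-written reverse-mode differentiation through the rollout.

```python
def rollout(k, x0=1.0, v0=0.0, dt=0.1, steps=50):
    """Simulate x'' = u with controller u = -k * x.

    Returns the final position and a tape of intermediate positions,
    which the backward pass needs to backpropagate through each step.
    """
    x, v = x0, v0
    tape = []
    for _ in range(steps):
        tape.append(x)           # store pre-step state for the backward pass
        v = v + dt * (-k * x)    # semi-implicit Euler: update velocity first
        x = x + dt * v           # ...then position, using the new velocity
    return x, tape


def grad_k(k, x0=1.0, v0=0.0, dt=0.1, steps=50):
    """Reverse-mode gradient of loss = x_final**2 w.r.t. the gain k."""
    x_final, tape = rollout(k, x0, v0, dt, steps)
    gx, gv, gk = 2.0 * x_final, 0.0, 0.0   # seed: d(loss)/d(x_final)
    for x_t in reversed(tape):
        # forward step was: v' = v - dt*k*x ; x' = x + dt*v'
        gv = gv + dt * gx            # x' depends on v'
        gk = gk + gv * (-dt * x_t)   # v' depends on k
        gx = gx + gv * (-dt * k)     # v' depends on x
    return x_final ** 2, gk
```

Because the gradient is computed through the exact discrete dynamics, one reverse pass gives the derivative for any number of parameters at the cost of roughly one extra rollout, which is precisely the advantage over finite differences that the abstract alludes to.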