Estimating Scene Flow in Robot Surroundings with Distributed Miniaturized Time-of-Flight Sensors

Sander, Jack, Caroleo, Giammarco, Albini, Alessandro, Maiolino, Perla

arXiv.org Artificial Intelligence 

-- Tracking the motion of humans or objects in a robot's surroundings is essential to improve safe robot motions and reactions. In this work, we present an approach for scene flow estimation from low-density and noisy point clouds acquired from miniaturised Time-of-Flight (ToF) sensors distributed across the robot's body. The proposed method clusters points from consecutive frames and applies the Iterative Closest Point (ICP) algorithm to estimate a dense motion flow, with additional steps introduced to mitigate the impact of sensor noise and low-density data points. Specifically, we employ a fitness-based classification to distinguish between stationary and moving points and an inlier removal strategy to refine geometric correspondences. The proposed approach is validated in an experimental setup where 24 ToF sensors are used to estimate the velocity of an object moving at different controlled speeds. Experimental results show that the method consistently approximates the direction of motion and its magnitude with an error in line with the sensor noise.

Robots operating in cluttered or shared environments must be aware of their surroundings to plan safe motions effectively. Tracking the motion of nearby humans and obstacles is crucial for detecting and reacting to potential collisions, as well as improving human-robot collaboration [1]-[4].
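The pipeline summarised above (per-cluster ICP alignment between consecutive frames, with inlier removal and a fitness score) can be sketched with one nearest-neighbour correspondence and SVD alignment step in numpy. This is an illustrative reconstruction under stated assumptions, not the authors' implementation: the inlier threshold (0.05 m), the synthetic point grid, and the frame interval are all invented for the example.

```python
import numpy as np

def icp_step(src, dst, inlier_thresh=0.05):
    """One ICP correspondence + alignment step between two point clusters.

    Returns a rotation R and translation t (so that R @ p + t ~= q for
    matched pairs) plus a fitness score: the fraction of correspondences
    kept as inliers. Thresholds here are illustrative, not from the paper.
    """
    # Brute-force nearest neighbours from src to dst; acceptable for the
    # low-density clouds produced by miniaturised ToF sensors.
    dists = np.linalg.norm(src[:, None, :] - dst[None, :, :], axis=2)
    nn = dists.argmin(axis=1)
    nn_dist = dists[np.arange(len(src)), nn]

    # Inlier removal: discard correspondences farther than a noise-scaled
    # threshold, so spurious matches do not bias the transform estimate.
    inliers = nn_dist < inlier_thresh
    fitness = float(inliers.mean())
    if inliers.sum() < 3:
        return np.eye(3), np.zeros(3), fitness

    p, q = src[inliers], dst[nn[inliers]]
    # Kabsch/SVD fit of the rigid transform minimising ||R p + t - q||.
    mp, mq = p.mean(axis=0), q.mean(axis=0)
    H = (p - mp).T @ (q - mq)
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))  # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = mq - R @ mp
    return R, t, fitness

# Synthetic cluster: a 5x5 grid of points translating at 0.2 m/s along x,
# observed in two frames 0.1 s apart.
dt = 0.1
xs, ys = np.meshgrid(np.arange(5) * 0.1, np.arange(5) * 0.1)
zs = xs * ys  # non-planar heights so the covariance is full rank
frame_a = np.stack([xs.ravel(), ys.ravel(), zs.ravel()], axis=1)
frame_b = frame_a + np.array([0.2 * dt, 0.0, 0.0])

R, t, fitness = icp_step(frame_a, frame_b)
velocity = t / dt  # per-cluster velocity estimate, here ~[0.2, 0, 0] m/s
```

In the spirit of the paper's fitness-based classification, a cluster could then be labelled moving when the estimated displacement magnitude `np.linalg.norm(t)` exceeds the sensor-noise floor, and stationary otherwise; the exact decision rule and thresholds used by the authors are not given in this excerpt.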