GDTM: An Indoor Geospatial Tracking Dataset with Distributed Multimodal Sensors

Ho Lyun Jeong, Ziqi Wang, Colin Samplawski, Jason Wu, Shiwei Fang, Lance M. Kaplan, Deepak Ganesan, Benjamin Marlin, Mani Srivastava

arXiv.org Artificial Intelligence 

Continuously locating moving objects, i.e., geospatial tracking, is essential for autonomous building infrastructure. Accurate and robust geospatial tracking often leverages multimodal sensor fusion algorithms, which require large datasets with time-aligned, synchronized data from various sensor types. However, such datasets are not readily available. Hence, we propose GDTM, a nine-hour dataset for multimodal object tracking with distributed multimodal sensors and reconfigurable sensor node placements. Our dataset enables the exploration of several research problems, such as optimizing architectures for processing multimodal data and investigating models' robustness to adverse sensing conditions and variations in sensor placement. A GitHub repository containing the code, sample data, and checkpoints of this work is available at https://github.com/nesl/GDTM.
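The abstract's central technical requirement, time-aligned data from heterogeneous sensors, typically comes down to pairing samples across streams that run at different rates. The following is a minimal sketch of nearest-timestamp alignment, assuming sorted per-stream timestamp arrays; the function name, sensor rates, and tolerance are illustrative assumptions and do not reflect the actual GDTM loader API.

import numpy as np

def align_streams(ref_ts, other_ts, tolerance=0.05):
    # For each reference timestamp, find the index of the nearest sample
    # in the other stream, or -1 if none lies within `tolerance` seconds.
    idx = np.searchsorted(other_ts, ref_ts)
    idx = np.clip(idx, 1, len(other_ts) - 1)
    left, right = other_ts[idx - 1], other_ts[idx]
    nearest = np.where(ref_ts - left < right - ref_ts, idx - 1, idx)
    matched = np.abs(other_ts[nearest] - ref_ts) <= tolerance
    return np.where(matched, nearest, -1)

# Toy example: align a hypothetical 10 Hz stream to a 25 Hz camera stream.
camera_ts = np.arange(0.0, 2.0, 0.04)   # hypothetical camera timestamps (s)
radar_ts = np.arange(0.01, 2.0, 0.10)   # hypothetical radar timestamps (s)
pairs = align_streams(camera_ts, radar_ts)
print(f"{np.count_nonzero(pairs >= 0)} of {len(camera_ts)} camera frames "
      f"have a radar sample within 50 ms")

In a distributed setup like the one described, per-node clocks must also be synchronized (e.g., via NTP) before this kind of per-stream matching is meaningful.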
