Deep Transfer Learning with Graph Neural Network for Sensor-Based Human Activity Recognition

Yan Yan, Tianzheng Liao, Jinjin Zhao, Jiahong Wang, Liang Ma, Wei Lv, Jing Xiong, Lei Wang

arXiv.org Artificial Intelligence 

Abstract--Sensor-based human activity recognition (HAR) in mobile application scenarios is often confronted with variation in sensor modalities and a deficiency of annotated data. Motivated by these observations, we devise a graph-inspired deep learning approach for sensor-based HAR tasks, which we further use to build a deep transfer learning model as a tentative solution to these two challenging problems. Specifically, we present a graph convolutional neural network with a multi-layer residual structure (ResGCNN) for sensor-based HAR, namely the HAR-ResGCNN approach. Experimental results on the PAMAP2 and mHealth data sets demonstrate that ResGCNN effectively captures the characteristics of actions, with results comparable to other sensor-based HAR models (average accuracies of 98.18% and 99.07%, respectively). The graph-based framework also shows good meta-learning ability and is a promising solution for sensor-based HAR tasks.

HAR systems track human activity states by processing and learning information from carriers that can record human actions, such as cameras [1], sensors [2], and radars. However, camera-based methods are restricted by the impact of complex scenarios and the uncertainty of actions, must consider the privacy problems caused by cameras, and are only suitable for specific scenes.
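The ResGCNN described above stacks graph-convolution layers with residual (skip) connections. As a rough illustration only, the following NumPy sketch shows one such residual graph-convolution layer; the normalization scheme (symmetric normalization with self-loops), activation, and shapes here are illustrative assumptions, not the authors' exact architecture.

```python
import numpy as np

def residual_gcn_layer(H, A, W, activation=np.tanh):
    """One illustrative residual graph-convolution layer: H' = act(A_norm @ H @ W) + H.

    H: (N, F) node feature matrix (e.g. one feature row per sensor channel).
    A: (N, N) adjacency matrix of the sensor graph.
    W: (F, F) trainable weight matrix (square, so the skip connection type-checks).
    """
    A_hat = A + np.eye(A.shape[0])           # add self-loops
    d = A_hat.sum(axis=1)                    # degrees of the self-looped graph
    D_inv_sqrt = np.diag(1.0 / np.sqrt(d))   # D^{-1/2}
    A_norm = D_inv_sqrt @ A_hat @ D_inv_sqrt # symmetric normalization
    return activation(A_norm @ H @ W) + H    # residual (skip) connection

# Hypothetical toy example: 4 sensor nodes in a chain, 3 features each.
rng = np.random.default_rng(0)
H = rng.normal(size=(4, 3))
A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
W = rng.normal(size=(3, 3))
H_out = residual_gcn_layer(H, A, W)          # same (4, 3) shape as the input
```

The residual term keeps the layer's output shape equal to its input shape, which is what lets such layers be stacked deeply without vanishing-gradient degradation.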
