

ActiNet: Activity intensity classification of wrist-worn accelerometers using self-supervised deep learning

Acquah, Aidan, Chan, Shing, Doherty, Aiden

arXiv.org Artificial Intelligence

The use of reliable and accurate human activity recognition (HAR) models on passively collected wrist-accelerometer data is essential in large-scale epidemiological studies that investigate the association between physical activity and health outcomes. While the use of self-supervised learning has generated considerable excitement in improving HAR, it remains unknown to what extent these models, coupled with hidden Markov models (HMMs), would make a tangible improvement to classification performance, and what effect this may have on the predicted daily activity intensity compositions. Using data from 151 CAPTURE-24 participants, we trained the ActiNet model, a self-supervised, 18-layer, modified ResNet-V2 model, followed by hidden Markov model (HMM) smoothing, to classify labels of activity intensity. The performance of this model, evaluated using 5-fold stratified group cross-validation, was then compared to a baseline random forest (RF) + HMM established in the existing literature. Differences in performance and classification outputs were also compared across subgroups of age and sex within the CAPTURE-24 population. The ActiNet model was able to distinguish labels of activity intensity with a mean macro F1 score of 0.82 and a mean Cohen's kappa score of 0.86. This exceeded the performance of the RF + HMM, trained and validated on the same dataset, with mean scores of 0.77 and 0.81, respectively. These findings were consistent across subgroups of age and sex, and encourage the use of ActiNet for the extraction of activity intensity labels from wrist-accelerometer data in future epidemiological studies.
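The HMM smoothing step described above can be sketched as Viterbi decoding over the classifier's per-window labels. This is a minimal, self-contained illustration only: the label set, self-transition probability, and emission probabilities below are assumed for the sketch, whereas the paper's HMM parameters would be fitted to the CAPTURE-24 data.

```python
import math

# Hypothetical intensity labels; the actual CAPTURE-24 label set may differ.
STATES = ["sedentary", "light", "moderate-vigorous"]

def viterbi_smooth(preds, states=STATES, stay=0.9, correct=0.8):
    """Viterbi decoding of a discrete HMM whose observations are the
    per-window labels emitted by an upstream classifier. `stay` is the
    self-transition probability and `correct` is the probability that
    the classifier's label matches the true state (both assumed here,
    not fitted)."""
    n = len(states)
    idx = {s: i for i, s in enumerate(states)}
    obs = [idx[p] for p in preds]
    log_trans = [[math.log(stay if i == j else (1 - stay) / (n - 1))
                  for j in range(n)] for i in range(n)]
    log_emit = [[math.log(correct if i == j else (1 - correct) / (n - 1))
                 for j in range(n)] for i in range(n)]
    # dp[i]: best log-probability of any state path ending in state i.
    dp = [math.log(1.0 / n) + log_emit[i][obs[0]] for i in range(n)]
    back = []
    for t in range(1, len(obs)):
        new, bk = [], []
        for j in range(n):
            best = max(range(n), key=lambda i: dp[i] + log_trans[i][j])
            new.append(dp[best] + log_trans[best][j] + log_emit[j][obs[t]])
            bk.append(best)
        dp = new
        back.append(bk)
    # Backtrace the most likely state sequence.
    j = max(range(n), key=lambda i: dp[i])
    path = [j]
    for bk in reversed(back):
        j = bk[j]
        path.append(j)
    path.reverse()
    return [states[i] for i in path]
```

With these settings, a single-window "light" blip inside a long sedentary bout is relabeled as sedentary, while a sustained bout of a new intensity survives the smoothing, which is the intended effect of the HMM stage.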


The Intelligent ICU Pilot Study: Using Artificial Intelligence Technology for Autonomous Patient Monitoring

Davoudi, Anis, Malhotra, Kumar Rohit, Shickel, Benjamin, Siegel, Scott, Williams, Seth, Ruppert, Matthew, Bihorac, Emel, Ozrazgat-Baslanti, Tezcan, Tighe, Patrick J., Bihorac, Azra, Rashidi, Parisa

arXiv.org Artificial Intelligence

Currently, many critical care indices are repetitively assessed and recorded by overburdened nurses, e.g. the physical function or facial pain expressions of nonverbal patients. In addition, much essential information about patients and their environment is not captured at all, or is captured in a non-granular manner, e.g. sleep disturbance factors such as bright light, loud background noise, or excessive visitations. In this pilot study, we examined the feasibility of using pervasive sensing technology and artificial intelligence for autonomous and granular monitoring of critically ill patients and their environment in the Intensive Care Unit (ICU). As an exemplar prevalent condition, we also characterized delirious and non-delirious patients and their environment. We used wearable sensors, light and sound sensors, and a high-resolution camera to collect data on patients and their environment, and analyzed the collected data using deep learning and statistical analysis. Our system performed face detection, face recognition, facial action unit detection, head pose detection, facial expression recognition, posture recognition, actigraphy analysis, sound pressure and light level detection, and visitation frequency detection. We were able to detect patients' faces (mean average precision (mAP) = 0.94), recognize patients' faces (mAP = 0.80), and recognize their postures (F1 = 0.94). We also found that all facial expressions, 11 activity features, visitation frequency during the day, visitation frequency during the night, light levels, and sound pressure levels during the night were significantly different between delirious and non-delirious patients (p < 0.05). In summary, we showed that granular and autonomous monitoring of critically ill patients and their environment is feasible and can be used to characterize critical care conditions and related environmental factors.