Synheart Emotion: Privacy-Preserving On-Device Emotion Recognition from Biosignals

Henok Ademtew, Israel Goytom

arXiv.org Artificial Intelligence 

Human emotions fundamentally shape decision-making, social interactions, and cognitive processes. Modern human-computer interaction (HCI) systems, however, remain largely oblivious to users' affective states, relying exclusively on explicit inputs such as touch, speech, or gaze. The proliferation of consumer wearables such as smartwatches, fitness trackers, and health monitors has democratized access to continuous physiological data, creating unprecedented opportunities for emotionally intelligent computing [1, 2].

Physiological signals offer several advantages over traditional modalities (facial expressions, voice) for emotion recognition: they are continuous, difficult to consciously manipulate, and unaffected by environmental factors such as lighting or occlusion [3]. Among these signals, heart rate variability (HRV), the temporal variation between consecutive heartbeats, has emerged as a robust biomarker of autonomic nervous system activity and emotional states [4, 5].

Despite significant research advances in affective computing, most emotion recognition systems exhibit two critical limitations:

1. Privacy vulnerabilities: Cloud-based inference requires transmitting sensitive biometric data to external servers, exposing users to data breaches, surveillance, and loss of autonomy [6].