Thelxino\"e: Recognizing Human Emotions Using Pupillometry and Machine Learning

Darlene Barker, Haim Levkowitz

arXiv.org Artificial Intelligence 

In this study, we present a method for emotion recognition in Virtual Reality (VR) using pupillometry. We analyze pupil-diameter responses to both visual and auditory stimuli presented through a VR headset, focusing on extracting key features in the time, frequency, and time-frequency domains from the VR-generated data. Our approach uses Maximum Relevance Minimum Redundancy (mRMR) feature selection to identify the most impactful features. By applying a Gradient Boosting model, an ensemble learning technique that combines decision trees sequentially, we achieve an accuracy of 98.8% with feature engineering, compared to 84.9% without it. This research contributes significantly to the Thelxinoë framework, which aims to enhance VR experiences by integrating data from multiple sensors to enable realistic and emotionally resonant touch interactions.

INTRODUCTION

In a poetic sense, the eyes have long been regarded as the "window into the soul," offering a glimpse into the depths of human emotions and experiences [1]. In the realm of modern technology, this poetic vision becomes scientific reality, particularly in VR. The pupils serve as a gateway not just to the brain but to the autonomic nervous system, subtly dilating and contracting in response to our emotions [1].
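To make the pipeline described in the abstract concrete, below is a minimal sketch: feature extraction from pupil-diameter traces, a greedy mRMR selection step, and a Gradient Boosting classifier. It assumes traces stored as fixed-length NumPy arrays with integer emotion labels; the specific features, sampling rate, and mRMR variant (mutual-information relevance penalized by mean absolute correlation with already-selected features) are illustrative assumptions, not the authors' exact setup, and time-frequency (e.g., wavelet) features are omitted for brevity.

```python
import numpy as np
from scipy.signal import welch
from sklearn.feature_selection import mutual_info_classif
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import train_test_split

def extract_features(trace, fs=120.0):
    """Time-domain and frequency-domain features from one pupil trace."""
    diff = np.diff(trace)
    freqs, psd = welch(trace, fs=fs, nperseg=min(256, len(trace)))
    return np.array([
        trace.mean(),               # mean diameter
        trace.std(),                # diameter variability
        trace.max() - trace.min(),  # dilation range
        diff.mean(),                # mean rate of change
        np.abs(diff).max(),         # peak dilation/constriction velocity
        freqs[np.argmax(psd)],      # dominant frequency
        psd.sum(),                  # total spectral power
    ])

def mrmr(X, y, k):
    """Greedy mRMR: maximize mutual information with the labels while
    penalizing mean absolute correlation with features already chosen."""
    relevance = mutual_info_classif(X, y, random_state=0)
    corr = np.abs(np.corrcoef(X, rowvar=False))
    selected = [int(np.argmax(relevance))]
    while len(selected) < k:
        candidates = [j for j in range(X.shape[1]) if j not in selected]
        scores = [relevance[j] - corr[j, selected].mean() for j in candidates]
        selected.append(candidates[int(np.argmax(scores))])
    return selected

# Hypothetical data: 200 traces of 600 samples (5 s at 120 Hz), 4 emotions.
rng = np.random.default_rng(0)
traces = rng.normal(3.5, 0.4, size=(200, 600))  # pupil diameter in mm
y = rng.integers(0, 4, size=200)

X = np.vstack([extract_features(t) for t in traces])
idx = mrmr(X, y, k=5)

X_tr, X_te, y_tr, y_te = train_test_split(X[:, idx], y, random_state=0)
clf = GradientBoostingClassifier().fit(X_tr, y_tr)
print("accuracy:", clf.score(X_te, y_te))
```

With real stimulus-locked pupil data in place of the synthetic traces, the same structure applies; the reported 98.8% vs. 84.9% contrast corresponds to running the classifier with and without the engineered-and-selected feature set.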
