Goto

Collaborating Authors

 Vandeborre, Jean-Philippe


Where Is My Mind (looking at)? Predicting Visual Attention from Brain Activity

arXiv.org Artificial Intelligence

Visual attention estimation is an active field of research at the crossroads of several disciplines: computer vision, artificial intelligence, and medicine. One of the most common approaches to estimating a saliency map representing attention is based on the observed images. In this paper, we show that visual attention can instead be retrieved from EEG acquisition, with results comparable to traditional predictions from observed images. For this purpose, a set of signals was recorded and several models were developed to study the relationship between visual attention and brain activity. The results are encouraging and comparable with approaches estimating attention from other modalities. The code and dataset used in this paper are available at \url{https://figshare.com/s/3e353bd1c621962888ad} to promote research in the field.
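The abstract does not specify the model architecture, so the following is only a minimal illustrative sketch, assuming a PyTorch setup: a hypothetical network mapping a multi-channel EEG window to a normalized 2-D saliency map. All class names, layer sizes, and input shapes are assumptions for illustration, not the authors' actual model.

```python
import torch
import torch.nn as nn

class EEGSaliencyNet(nn.Module):
    """Hypothetical sketch: map an EEG window (channels x time samples)
    to a normalized 2-D saliency map. All dimensions are illustrative."""
    def __init__(self, n_channels=32, n_samples=256, map_h=16, map_w=16):
        super().__init__()
        self.map_h, self.map_w = map_h, map_w
        # Temporal convolution across channels, then a linear head.
        self.features = nn.Sequential(
            nn.Conv1d(n_channels, 64, kernel_size=7, padding=3),
            nn.ReLU(),
            nn.AdaptiveAvgPool1d(8),  # collapse time to a fixed length
            nn.Flatten(),
        )
        self.head = nn.Linear(64 * 8, map_h * map_w)

    def forward(self, x):
        # x: (batch, n_channels, n_samples)
        logits = self.head(self.features(x))
        # Softmax over pixels so each map is a probability distribution,
        # which makes it directly comparable to a fixation density map.
        sal = torch.softmax(logits, dim=1)
        return sal.view(-1, self.map_h, self.map_w)

model = EEGSaliencyNet()
saliency = model(torch.randn(4, 32, 256))  # 4 mock EEG windows
```

Normalizing the output as a distribution is a common design choice for saliency prediction, since standard losses such as KL divergence against ground-truth fixation maps then apply directly.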


Emotion Estimation from EEG -- A Dual Deep Learning Approach Combined with Saliency

arXiv.org Artificial Intelligence

Emotion estimation is an active field of research with an important impact on human-computer interaction. Among the different modalities used to assess emotion, the electroencephalogram (EEG), which records the electrical activity of the brain, has produced motivating results over the last decade. Emotion estimation from EEG could help in the diagnosis or rehabilitation of certain diseases. In this paper, we propose a dual method that combines physiological knowledge defined by specialists with novel deep learning (DL) models originally designed for computer vision. The joint learning is enhanced with a model saliency analysis. To demonstrate the generality of the approach, the model has been evaluated on four publicly available datasets; it achieves results similar to state-of-the-art approaches, outperforming them on two of the datasets, with a lower standard deviation that reflects higher stability. For the sake of reproducibility, the code and models proposed in this paper are available at github.com/VDelv/Emotion-EEG.
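The dual idea described above, one branch built on specialist-defined physiological features and one on a vision-style network, can be sketched as follows. This is a hypothetical PyTorch illustration of the general two-branch pattern, assuming band-power features as the physiological input; none of the names or dimensions come from the paper.

```python
import torch
import torch.nn as nn

class DualEmotionNet(nn.Module):
    """Hypothetical sketch of a dual approach: one branch consumes
    hand-crafted physiological features (e.g. per-channel frequency
    band powers chosen by specialists), the other applies a
    vision-style CNN to the raw EEG treated as a 2-D image.
    All dimensions are illustrative."""
    def __init__(self, n_channels=32, n_samples=128, n_bands=5, n_classes=4):
        super().__init__()
        # Branch 1: physiological features (channels x frequency bands).
        self.feat_branch = nn.Sequential(
            nn.Linear(n_channels * n_bands, 64), nn.ReLU(),
        )
        # Branch 2: CNN over the raw EEG "image" (1 x channels x samples).
        self.cnn_branch = nn.Sequential(
            nn.Conv2d(1, 8, kernel_size=3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d((4, 4)), nn.Flatten(),
            nn.Linear(8 * 4 * 4, 64), nn.ReLU(),
        )
        # Joint head over the concatenated branch representations,
        # so both views are learned together (joint learning).
        self.classifier = nn.Linear(64 + 64, n_classes)

    def forward(self, band_power, raw_eeg):
        a = self.feat_branch(band_power.flatten(1))
        b = self.cnn_branch(raw_eeg.unsqueeze(1))  # add image channel dim
        return self.classifier(torch.cat([a, b], dim=1))

net = DualEmotionNet()
# 2 mock trials: band powers (batch, channels, bands), raw EEG (batch, channels, samples)
logits = net(torch.randn(2, 32, 5), torch.randn(2, 32, 128))
```

Saliency analysis of such a model, e.g. gradients of the predicted class with respect to each input branch, can then indicate which channels or bands drive the prediction, which is the kind of model inspection the abstract refers to.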