Emotional Reaction Intensity Estimation Based on Multimodal Data

Wang, Shangfei, Wu, Jiaqiang, Zheng, Feiyi, Li, Xin, Li, Xuewei, Wang, Suwen, Wu, Yi, Chang, Yanan, Miao, Xiangyu

arXiv.org Artificial Intelligence 

This paper introduces our method for the Emotional Reaction Intensity (ERI) Estimation Challenge at the CVPR 2023 5th Workshop and Competition on Affective Behavior Analysis in-the-wild (ABAW). Based on the multimodal data provided by the organizers, we extract acoustic and visual features with different pretrained models, and fuse the multimodal features with Transformer Encoders using a cross-modal attention mechanism. Our contributions are: 1) we extract stronger features with state-of-the-art pretrained models; 2) compared with the baseline, we substantially improve the Pearson's Correlation Coefficient; 3) we apply targeted data-processing techniques to further enhance the performance of our model.
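The abstract describes fusing pretrained acoustic and visual features with Transformer Encoders and cross-modal attention before regressing per-emotion intensities. The following is a minimal PyTorch sketch of that kind of fusion, not the authors' implementation; the feature dimensions, layer counts, number of emotion categories, and regression head are illustrative assumptions.

```python
# Hypothetical sketch of cross-modal attention fusion for ERI estimation.
# Dimensions and architecture details are assumptions, not the paper's exact setup.
import torch
import torch.nn as nn


class CrossModalFusion(nn.Module):
    def __init__(self, audio_dim=1024, visual_dim=768, d_model=256,
                 n_heads=4, n_layers=2, n_emotions=7):
        super().__init__()
        # Project each modality's pretrained features into a shared space.
        self.audio_proj = nn.Linear(audio_dim, d_model)
        self.visual_proj = nn.Linear(visual_dim, d_model)
        # Cross-modal attention: each modality attends to the other.
        self.a2v_attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
        self.v2a_attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
        # Transformer Encoder over the fused token sequence.
        enc_layer = nn.TransformerEncoderLayer(
            d_model, n_heads, dim_feedforward=4 * d_model, batch_first=True)
        self.encoder = nn.TransformerEncoder(enc_layer, num_layers=n_layers)
        # Regression head predicting one intensity per emotion category.
        self.head = nn.Sequential(
            nn.Linear(d_model, d_model), nn.ReLU(), nn.Linear(d_model, n_emotions))

    def forward(self, audio_feats, visual_feats):
        # audio_feats: (B, Ta, audio_dim); visual_feats: (B, Tv, visual_dim)
        a = self.audio_proj(audio_feats)
        v = self.visual_proj(visual_feats)
        # Audio queries attend over visual keys/values, and vice versa.
        a_fused, _ = self.a2v_attn(query=a, key=v, value=v)
        v_fused, _ = self.v2a_attn(query=v, key=a, value=a)
        fused = torch.cat([a + a_fused, v + v_fused], dim=1)  # (B, Ta+Tv, d_model)
        fused = self.encoder(fused)
        # Temporal average pooling, then per-emotion intensity regression.
        return self.head(fused.mean(dim=1))  # (B, n_emotions)


if __name__ == "__main__":
    model = CrossModalFusion()
    audio = torch.randn(2, 50, 1024)    # frame-level acoustic features (assumed shape)
    visual = torch.randn(2, 32, 768)    # frame-level visual features (assumed shape)
    print(model(audio, visual).shape)   # torch.Size([2, 7])
```

In this sketch each modality both keeps its own representation (residual add) and receives information from the other modality via cross-attention, which is one common way to realize the cross-modal fusion the abstract names.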
