IncepFormerNet: A multi-scale multi-head attention network for SSVEP classification

Huang, Yan, Chen, Yongru, Cao, Lei, Cao, Yongnian, Yang, Xuechun, Dong, Yilin, Liu, Tianyu

arXiv.org Artificial Intelligence 

Deep learning (DL) methods have been successfully applied to SSVEP-based brain-computer interfaces (SSVEP-BCIs). This study proposes a new model called IncepFormerNet, a hybrid of the Inception and Transformer architectures. IncepFormerNet extracts multi-scale temporal information from time-series data using parallel convolution kernels of varying sizes, capturing the subtle variations and critical features within SSVEP signals. Furthermore, the model integrates the multi-head attention mechanism from the Transformer architecture, which captures global dependencies and enhances the representation of complex patterns. Additionally, it takes advantage of filter bank techniques to extract features based on the spectral characteristics of SSVEP data. To validate the effectiveness of the proposed model, we conducted experiments on two public datasets. The experimental results show that IncepFormerNet achieves an accuracy of 87.41% on Dataset 1 and 71.97% on Dataset 2 using a 1.0-second time window.
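
The abstract describes a multi-scale convolutional front end combined with Transformer-style multi-head attention. The sketch below illustrates that general idea in PyTorch; the kernel sizes, channel counts, pooling factor, 40-class output, and input shape are illustrative assumptions, not the authors' published configuration, and the filter bank stage is omitted.

```python
# Minimal sketch of the multi-scale convolution + multi-head attention idea.
# All hyperparameters below are assumptions for illustration only.
import torch
import torch.nn as nn


class InceptionTemporalBlock(nn.Module):
    """Parallel 1-D convolutions with different kernel sizes (Inception-style)."""

    def __init__(self, in_ch: int, branch_ch: int, kernel_sizes=(7, 15, 31)):
        super().__init__()
        self.branches = nn.ModuleList(
            nn.Sequential(
                nn.Conv1d(in_ch, branch_ch, k, padding=k // 2),
                nn.BatchNorm1d(branch_ch),
                nn.GELU(),
            )
            for k in kernel_sizes
        )

    def forward(self, x):  # x: (batch, channels, time)
        # Concatenate multi-scale temporal features along the channel axis.
        return torch.cat([b(x) for b in self.branches], dim=1)


class IncepFormerSketch(nn.Module):
    """Inception-style temporal feature extractor followed by multi-head
    self-attention over the time axis; a sketch, not the published model."""

    def __init__(self, eeg_channels=9, n_classes=40, branch_ch=16,
                 n_heads=4, seq_pool=4):
        super().__init__()
        self.incep = InceptionTemporalBlock(eeg_channels, branch_ch)
        d_model = branch_ch * 3                      # three parallel branches
        self.pool = nn.AvgPool1d(seq_pool)           # shorten the sequence
        self.attn = nn.TransformerEncoderLayer(
            d_model=d_model, nhead=n_heads,
            dim_feedforward=2 * d_model, batch_first=True)
        self.head = nn.Linear(d_model, n_classes)

    def forward(self, x):                            # x: (batch, eeg_ch, time)
        feats = self.pool(self.incep(x))             # (batch, d_model, time')
        feats = feats.transpose(1, 2)                # (batch, time', d_model)
        feats = self.attn(feats)                     # global dependencies
        return self.head(feats.mean(dim=1))          # classify pooled tokens


if __name__ == "__main__":
    # One batch of 1.0-second trials at an assumed 250 Hz with 9 EEG channels.
    dummy = torch.randn(2, 9, 250)
    print(IncepFormerSketch()(dummy).shape)          # torch.Size([2, 40])
```

In the full method as described, a filter bank would decompose each trial into several sub-band signals before feature extraction, with the sub-band features combined for classification; the sketch processes a single band for brevity.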