QKFormer: Hierarchical Spiking Transformer using Q-K Attention
Neural Information Processing Systems
Spiking Transformers, which integrate Spiking Neural Networks (SNNs) with Transformer architectures, have attracted significant attention due to their potential for low energy consumption and high performance. However, there remains a substantial performance gap between SNNs and Artificial Neural Networks (ANNs). To narrow this gap, we developed QKFormer, a directly trained spiking transformer with the following features: (i) a novel spike-form Q-K attention module that efficiently models token or channel attention through binary vectors and enables the construction of larger models.
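As an illustration of the idea behind spike-form attention, the sketch below shows how a binary per-token attention vector can gate a spike tensor using only additions and masking, with no floating-point matrix multiply. This is a minimal NumPy sketch under assumed shapes and a hypothetical threshold; it illustrates the binary-vector gating principle, not the exact QKFormer module.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical shapes: N tokens, D channels; Q and K are binary spike maps.
N, D = 4, 8
Q = (rng.random((N, D)) > 0.5).astype(np.float32)
K = (rng.random((N, D)) > 0.5).astype(np.float32)

def spike(x, threshold):
    # Heaviside-style firing decision standing in for a spiking neuron.
    return (x >= threshold).astype(np.float32)

# Illustrative token attention: sum Q over channels, fire a binary
# per-token mask, then gate K row-wise. Only additions and element-wise
# masking are needed, unlike the dense QK^T product of vanilla attention.
token_scores = Q.sum(axis=1, keepdims=True)        # (N, 1) spike counts
token_mask = spike(token_scores, threshold=D / 2)  # binary (N, 1) mask
out = token_mask * K                               # gated spikes, (N, D)

print(out.shape)  # (4, 8)
```

Because every tensor stays binary, the gated output remains a spike map, which is what allows such attention to run on accumulate-only (multiplication-free) spiking hardware.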