QKFormer: Hierarchical Spiking Transformer using Q-K Attention
Neural Information Processing Systems
Spiking Transformers, which integrate Spiking Neural Networks (SNNs) with Transformer architectures, have attracted significant attention for their potential to combine low energy consumption with high performance. However, a substantial performance gap remains between SNNs and Artificial Neural Networks (ANNs). To narrow this gap, we develop QKFormer, a direct-training spiking transformer with the following features: i) linear complexity and high energy efficiency: the novel spike-form Q-K attention module efficiently models token or channel attention through binary vectors and enables the construction of larger models.
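The linear complexity claimed in the abstract comes from replacing the quadratic Q·Kᵀ product with a binary attention vector. The sketch below is a minimal PyTorch illustration of that idea under our own assumptions, not the authors' reference implementation: the module name `QKTokenAttention`, the projection layers, and the Heaviside `spike` function (standing in for a proper spiking neuron such as leaky integrate-and-fire with surrogate gradients) are all hypothetical.

```python
import torch
import torch.nn as nn


class QKTokenAttention(nn.Module):
    """Sketch of spike-form Q-K token attention (hypothetical, not the paper's code)."""

    def __init__(self, dim):
        super().__init__()
        self.q_proj = nn.Linear(dim, dim, bias=False)
        self.k_proj = nn.Linear(dim, dim, bias=False)

    @staticmethod
    def spike(x):
        # Heaviside step as a stand-in spiking neuron: outputs binary 0/1 vectors.
        # A trained SNN would use LIF dynamics plus a surrogate gradient instead.
        return (x > 0).float()

    def forward(self, x):
        # x: (batch, tokens, dim)
        q = self.spike(self.q_proj(x))  # binary Q spikes, (B, N, D)
        k = self.spike(self.k_proj(x))  # binary K spikes, (B, N, D)
        # Token attention: reduce Q over channels, spike into a binary per-token mask.
        a = self.spike(q.sum(dim=-1, keepdim=True))  # (B, N, 1), binary
        # Masking K row-wise costs O(N * D) -- linear in the token count N,
        # unlike the O(N^2 * D) of standard softmax(Q K^T) V attention.
        return a * k


x = torch.randn(2, 196, 64)      # e.g., 196 patch tokens of width 64
out = QKTokenAttention(64)(x)    # shape (2, 196, 64)
```

Because the attention vector is binary, the masking step reduces to selecting or zeroing whole rows of K, which is also where the energy-efficiency argument comes from: multiplications by 0/1 spikes can be realized as additions or skips on neuromorphic hardware.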