MambaLRP: Explaining Selective State Space Sequence Models

Klaus-Robert Müller, Oliver Eberle

Neural Information Processing Systems 

Recent sequence modeling approaches using selective state space sequence models, referred to as Mamba models, have seen a surge of interest. These models process long sequences efficiently in linear time and are rapidly being adopted in a wide range of applications, such as language modeling, where they demonstrate promising performance. To foster their reliable use in real-world scenarios, it is crucial to augment their transparency.
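The linear-time property mentioned above comes from the selective scan at the core of these models: the hidden state is updated once per time step with input-dependent parameters, so the cost grows linearly with sequence length. The following is a minimal NumPy sketch of such a recurrence with a diagonal state matrix, not the actual Mamba implementation (which uses discretized continuous-time parameters and a hardware-aware parallel scan); all names here are illustrative.

```python
import numpy as np

def selective_scan(x, A, B, C):
    """Minimal diagonal selective-scan sketch.

    x: (T, d) input sequence; A, B, C: (T, d) per-timestep
    (input-dependent, hence "selective") parameters.
    Runs in O(T * d), i.e. linear in sequence length T.
    """
    T, d = x.shape
    h = np.zeros(d)          # hidden state
    ys = np.empty((T, d))
    for t in range(T):
        h = A[t] * h + B[t] * x[t]   # elementwise (diagonal) state update
        ys[t] = C[t] * h             # readout at step t
    return ys
```

With A[t] = 0.5, B[t] = C[t] = 1, and x[t] = 1 everywhere, the output converges toward 2, illustrating how the state accumulates past inputs with exponential decay.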