Associative Recurrent Memory Transformer
Ivan Rodkin, Yuri Kuratov, Aydar Bulatov, Mikhail Burtsev
arXiv.org Artificial Intelligence
This paper addresses the challenge of building a neural architecture for very long sequences that requires constant time to process new information at each time step. Our approach, the Associative Recurrent Memory Transformer (ARMT), combines transformer self-attention for local context with segment-level recurrence for storing task-specific information distributed over a long context. We demonstrate that ARMT outperforms existing alternatives on associative retrieval tasks and sets a new performance record on the recent BABILong multi-task long-context benchmark, answering single-fact questions over 50 million tokens with an accuracy of 79.9%. The source code for training and evaluation is available on GitHub.
Jul-5-2024
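The abstract describes the core mechanism: full self-attention within a local segment plus a small recurrent memory carried across segments, so per-step cost stays constant regardless of total length. Below is a minimal PyTorch sketch of that segment-level recurrence idea only. All names, parameter values, and design choices here are hypothetical illustrations, not the authors' implementation, and the sketch omits ARMT's associative memory component.

```python
import torch
import torch.nn as nn

class RecurrentMemorySketch(nn.Module):
    """Toy segment-level recurrence with memory tokens (illustrative only).

    A long, already-embedded input is split into fixed-size segments. Each
    segment is processed with full self-attention together with a small set
    of memory tokens; the updated memory tokens are passed to the next
    segment, so the per-segment cost is constant in the total length.
    """

    def __init__(self, d_model=64, n_heads=4, n_mem=8, seg_len=128):
        super().__init__()
        self.seg_len = seg_len
        # Learned initial memory tokens (hypothetical parameterization).
        self.mem_init = nn.Parameter(torch.randn(n_mem, d_model) * 0.02)
        layer = nn.TransformerEncoderLayer(
            d_model, n_heads, dim_feedforward=4 * d_model, batch_first=True
        )
        self.encoder = nn.TransformerEncoder(layer, num_layers=2)

    def forward(self, x):  # x: (batch, seq_len, d_model) embeddings
        batch = x.size(0)
        mem = self.mem_init.unsqueeze(0).expand(batch, -1, -1)
        outputs = []
        for seg in x.split(self.seg_len, dim=1):
            # Self-attention over [memory tokens; current segment]: the
            # attention window is bounded by n_mem + seg_len per step.
            h = self.encoder(torch.cat([mem, seg], dim=1))
            # Carry the updated memory tokens forward to the next segment.
            mem, out = h[:, : mem.size(1)], h[:, mem.size(1):]
            outputs.append(out)
        return torch.cat(outputs, dim=1), mem

# Usage sketch: a 512-token toy input is processed as four 128-token
# segments, with an 8-token memory threaded through all of them.
model = RecurrentMemorySketch()
tokens = torch.randn(2, 512, 64)
out, final_mem = model(tokens)
print(out.shape, final_mem.shape)  # (2, 512, 64), (2, 8, 64)
```

ARMT additionally augments this recurrence with an associative memory for retrieval over distributed facts; see the paper and repository for the actual architecture.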