Fovea Transformer: Efficient Long-Context Modeling with Structured Fine-to-Coarse Attention