In-Context Learning with Transformers: Softmax Attention Adapts to Function Lipschitzness

Neural Information Processing Systems 

We explore the role of softmax attention in an in-context learning (ICL) setting where each context encodes a regression task. We show that an attention unit learns a window that it uses to implement a nearest-neighbors predictor adapted to the landscape of the pretraining tasks.
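To make the abstract's mechanism concrete, here is a minimal sketch of how one softmax-attention unit over (input, label) context pairs acts as a soft nearest-neighbors (Nadaraya-Watson) predictor. This is not the paper's construction: the scores use a negative squared distance in place of the trained query-key product, and `window` is a hypothetical stand-in for the learned attention window.

```python
import numpy as np

def softmax(z):
    z = z - z.max()                     # stabilize before exponentiating
    e = np.exp(z)
    return e / e.sum()

def attention_predict(x_query, X_ctx, y_ctx, window):
    """One softmax-attention unit read as a soft nearest-neighbors predictor.

    `window` stands in for the learned attention window: larger values
    average labels over more context points; smaller values approach a
    1-nearest-neighbor prediction.
    """
    d2 = np.sum((X_ctx - x_query) ** 2, axis=1)   # (n,) squared distances
    weights = softmax(-d2 / window)               # soft nearest-neighbor weights
    return weights @ y_ctx                        # weighted label average

# Toy usage: the context encodes a noisy 1-D regression task
rng = np.random.default_rng(0)
X_ctx = rng.uniform(-1, 1, size=(32, 1))
y_ctx = np.sin(3 * X_ctx[:, 0]) + 0.1 * rng.normal(size=32)
x_query = np.array([0.2])

print(attention_predict(x_query, X_ctx, y_ctx, window=0.01))  # narrow: local
print(attention_predict(x_query, X_ctx, y_ctx, window=1.0))   # wide: smoothed
```

Under these assumptions, the window is the knob the title alludes to: a narrow window tracks rapidly varying targets (large Lipschitz constant), while a wide window averages out label noise on smoother ones.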
