Naturalness of Attention: Revisiting Attention in Code Language Models