Rethinking Softmax: Self-Attention with Polynomial Activations