Revisiting Attention Weights as Interpretations of Message-Passing Neural Networks
