Analysis of Dirichlet Energies as Over-smoothing Measures
Anna Bison, Alessandro Sperduti
–arXiv.org Artificial Intelligence
One of the most analyzed problems in Graph Neural Networks (GNNs) is over-smoothing, which is usually described as the exponential convergence of node embeddings to a common vector as the number of GNN layers grows. One of the most frequently used metrics for analyzing over-smoothing, both theoretically and empirically, is the Dirichlet energy induced by the graph Laplacian, with different possible choices of Laplacian, as analyzed in the next section. A formal axiomatic definition of over-smoothing, based on the notion of a "total over-smoothing" state in which all node embeddings are identical, has been proposed in [1]. A key axiom of that proposal is that a smoothness measure should be zero if and only if this state is reached. In this paper, we point out that the widely used Dirichlet energy induced by the normalized graph Laplacian does not satisfy this axiom. Recently, other issues with adopting Dirichlet energies as over-smoothing measures were pointed out in [2], where it is explained that the Dirichlet energy induced by the normalized Laplacian tends to zero when the node embeddings tend to its dominant eigenvector v s.t.
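The axiom violation described above can be checked numerically. The sketch below (an illustrative example, not code from the paper) computes the Dirichlet energy tr(Xᵀ L X) on a small non-regular graph for a "totally over-smoothed" state where all node embeddings are identical: the energy induced by the unnormalized Laplacian vanishes, because the constant vector spans its kernel, while the energy induced by the normalized Laplacian does not, since its kernel is spanned by D^{1/2}𝟙 instead.

```python
import numpy as np

# Path graph on 3 nodes; it is non-regular (degrees 1, 2, 1),
# which is what makes the two Laplacians disagree below.
A = np.array([[0., 1., 0.],
              [1., 0., 1.],
              [0., 1., 0.]])
deg = A.sum(axis=1)
L = np.diag(deg) - A                      # unnormalized Laplacian L = D - A
D_inv_sqrt = np.diag(1.0 / np.sqrt(deg))
L_sym = D_inv_sqrt @ L @ D_inv_sqrt       # normalized Laplacian D^{-1/2} L D^{-1/2}

def dirichlet(X, lap):
    """Dirichlet energy tr(X^T lap X) induced by the given Laplacian."""
    return float(np.trace(X.T @ lap @ X))

# "Total over-smoothing" state: every node has the same embedding.
X_const = np.ones((3, 2))

e_unnorm = dirichlet(X_const, L)      # exactly 0: constant vectors lie in ker(L)
e_norm = dirichlet(X_const, L_sym)    # strictly positive on a non-regular graph

print(e_unnorm, e_norm)
```

So a measure based on the normalized-Laplacian energy can report nonzero "smoothness" even in the totally over-smoothed state, violating the zero-iff axiom; on a regular graph the two energies would agree up to scaling and the discrepancy disappears.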
Dec-11-2025