Implicit SVD for Graph Representation Learning
Neural Information Processing Systems
Recent improvements in the performance of state-of-the-art (SOTA) methods for Graph Representation Learning (GRL) have come at the cost of significant computational resource requirements for training, e.g., for calculating gradients via backprop over many data epochs. Meanwhile, Singular Value Decomposition (SVD) can find closed-form solutions to convex problems using merely a handful of epochs. In this paper, we make GRL more computationally tractable for those with modest hardware. We design a framework that computes the SVD of *implicitly* defined matrices, and apply this framework to several GRL tasks. For each task, we derive a first-order approximation of a SOTA model, where we design an (expensive-to-store) matrix $\mathbf{M}$ and train the model, in closed form, via the SVD of $\mathbf{M}$, without calculating the entries of $\mathbf{M}$.
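The core idea of an implicit SVD can be sketched in a few lines. The snippet below is a hypothetical illustration, not the paper's actual framework: it takes a matrix $\mathbf{M} = A A^\top$ (which could be dense and expensive to store even when $A$ is sparse) and computes its top singular triples by exposing only matrix-vector products through SciPy's `LinearOperator`, so the entries of $\mathbf{M}$ are never materialized.

```python
import numpy as np
from scipy.sparse import random as sparse_random
from scipy.sparse.linalg import LinearOperator, svds

# Hypothetical setup: a sparse matrix A standing in for, e.g., a graph
# adjacency. M = A @ A.T may be dense, but M @ v costs two sparse matvecs.
n = 500
A = sparse_random(n, n, density=0.01, random_state=0, format="csr")

M_op = LinearOperator(
    shape=(n, n),
    matvec=lambda v: A @ (A.T @ v),   # computes M v without forming M
    rmatvec=lambda v: A @ (A.T @ v),  # M is symmetric in this example
)

# Top-k singular triples of the implicitly defined M.
k = 8
U, s, Vt = svds(M_op, k=k)
```

The memory footprint here is that of $A$ plus a few length-$n$ vectors, rather than the $O(n^2)$ cost of storing $\mathbf{M}$ explicitly.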
Oct-10-2024, 05:31:42 GMT