Kernels and learning curves for Gaussian process regression on random graphs
Sollich, Peter, Urry, Matthew, Coti, Camille
Neural Information Processing Systems
We investigate how well Gaussian process regression can learn functions defined on graphs, using large regular random graphs as a paradigmatic example. Random-walk based kernels are shown to have some surprising properties: within the standard approximation of a locally tree-like graph structure, the kernel does not become constant, i.e. neighbouring function values do not become fully correlated, when the lengthscale $\sigma$ of the kernel is made large. Instead the kernel attains a non-trivial limiting form, which we calculate. The fully correlated limit is reached only once loops become relevant, and we estimate where the crossover to this regime occurs. Our main subject is learning curves of Bayes error versus training set size.
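The random-walk kernels studied here can be illustrated with a minimal sketch. Assuming the standard $p$-step random-walk kernel $K \propto \left(\mathbf{1} - a^{-1}L\right)^p$ (with $L$ the normalised graph Laplacian and $a \geq 2$), on a regular graph this reduces to repeated local averaging of neighbouring function values; the parameters `a` and `p`, and the choice of a simple ring as the regular graph, are illustrative assumptions, not the paper's experimental setup.

```python
import numpy as np

def random_walk_kernel(A, a=2.0, p=10):
    """Sketch of a p-step random-walk kernel on a regular graph with
    adjacency matrix A: K = (I - (I - D^{-1} A) / a)^p, then normalised
    to unit diagonal (unit prior variance). Parameter values here are
    illustrative only."""
    n = A.shape[0]
    d = A.sum(axis=1)                                  # node degrees
    W = np.eye(n) - (np.eye(n) - A / d[:, None]) / a   # one smoothing step
    K = np.linalg.matrix_power(W, p)                   # p smoothing steps
    s = np.sqrt(np.diag(K))
    return K / np.outer(s, s)

# A simple 2-regular graph: a ring of n nodes, node i joined to i-1 and i+1.
n = 20
A = np.zeros((n, n))
for i in range(n):
    A[i, (i + 1) % n] = A[(i + 1) % n, i] = 1.0

K = random_walk_kernel(A, a=2.0, p=10)
# Correlation between neighbouring function values is high but below 1.
print(K[0, 1])
```

Increasing `p` (which plays the role of a growing lengthscale) pushes neighbouring correlations towards, but on tree-like structures not all the way to, the fully correlated limit.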