A Experimental details

Icosahedral MNIST We use node and edge neighbourhoods with k = 1. We find the edge neighbourhood isomorphism classes and, for each class, the generators of the automorphism group using the software package Nauty. The MNIST digit input is a trivial feature; each subsequent feature is a vector feature of the permutation group, except for the last layer, which is again trivial. We find a basis for the kernels satisfying the kernel constraint using SVD. The parameters linearly combine these basis kernels into the kernel used for the convolution. The trivial baseline uses trivial features throughout, which is equivalent to a simple Graph Convolutional Network. The baseline uses 6 times wider channels to compensate for the smaller representations. We did not optimize hyperparameters and have copied the architecture from Cohen et al. [2019]. We use 6 convolutional layers with output multiplicities 8, 16, 16, 23, 23, 32, 64, with stride 1 at each second layer. After each convolution, we use ...
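
The kernel-basis step can be illustrated with a minimal sketch (not the authors' implementation): given permutation matrices for the generators of an automorphism group acting on the input and output features, we vectorize the equivariance constraint and take the SVD null space as the basis of admissible kernels, which the learnable parameters then combine linearly. The function names and the toy cyclic-group example below are placeholders for illustration only.

```python
# Sketch: SVD-based basis for kernels K satisfying rho_out(g) K = K rho_in(g)
# for every generator g of a (permutation) automorphism group.
import numpy as np

def perm_matrix(perm):
    """Permutation matrix mapping basis vector j to basis vector perm[j]."""
    n = len(perm)
    P = np.zeros((n, n))
    P[perm, np.arange(n)] = 1.0
    return P

def kernel_basis(gens_in, gens_out, tol=1e-6):
    """Basis of kernels commuting with the group action, shape (num_basis, d_out, d_in).

    gens_in, gens_out: permutation matrices of the generators on input/output features.
    """
    d_in, d_out = gens_in[0].shape[0], gens_out[0].shape[0]
    # Row-major vectorization: rho_out K rho_in^{-1} = K  becomes
    # (rho_out kron rho_in - I) vec(K) = 0 for permutation representations.
    rows = [np.kron(go, gi) - np.eye(d_out * d_in)
            for gi, go in zip(gens_in, gens_out)]
    A = np.concatenate(rows, axis=0)
    # Null space of the stacked constraints via SVD.
    _, s, vt = np.linalg.svd(A)
    basis = vt[s < tol]                       # (num_basis, d_out * d_in)
    return basis.reshape(-1, d_out, d_in)

# Toy example: kernels equivariant to a cyclic shift on 3 elements.
shift = perm_matrix((1, 2, 0))
basis = kernel_basis([shift], [shift])
# The convolution kernel is a learned linear combination of the basis kernels.
theta = np.random.randn(len(basis))           # learnable parameters
K = np.tensordot(theta, basis, axes=1)
```

In this toy case the null space is spanned by the circulant matrices, so three basis kernels are recovered; in the experiments the generators come from the edge-neighbourhood automorphism groups found with Nauty rather than from a hand-picked cyclic group.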