Reviews: Cross-channel Communication Networks

Neural Information Processing Systems 

The authors propose an approach to increase the representational power of neural networks by introducing communication between the neurons in the same layer. To this end, a neural communication block is introduced. It first encodes the feature map of each neuron to reduce its dimensionality by a factor of 8. Then an attention-based GCN is used to propagate information between the neurons over a fully-connected graph. In practice, a weighted sum of the neuron encodings is computed for each node, where the weights are determined by the similarity between the nodes' features. Finally, the updated representation is decoded back to the original resolution and added to the original features. Importantly, this model applies the same operations to every neuron, so the number of parameters is independent of the number of neurons, but depends on the spatial size of the feature map.
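The communication step described above could be sketched roughly as follows. This is a minimal NumPy illustration, not the authors' implementation: the function name, the shared linear encoder/decoder weights, and the dot-product similarity are assumptions standing in for the paper's exact encoder, attention, and decoder details.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax for the attention weights.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def communication_block(feats, W_enc, W_dec):
    """Hypothetical cross-neuron communication step.

    feats: (N, d)     N neurons, each with a flattened feature map of size d.
    W_enc: (d, d//8)  shared encoder, reduces dimensionality by a factor of 8.
    W_dec: (d//8, d)  shared decoder back to the original size.

    The same W_enc / W_dec are applied to every neuron, so the parameter
    count depends on d (the spatial size) but not on the number of neurons N.
    """
    z = feats @ W_enc                 # encode each neuron's feature map
    attn = softmax(z @ z.T, axis=-1)  # similarity-based weights, fully-connected graph
    z_new = attn @ z                  # weighted sum of neuron encodings per node
    return feats + z_new @ W_dec      # decode to original resolution, residual add
```

The residual addition at the end matches the review's description of the updated representation being added to the original features.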