Supplementary for Emergence of Shape Bias in Convolutional Neural Networks through Activation Sparsity

1 Further Results of the Impact of Sparsity on the Shape Bias Benchmark
–Neural Information Processing Systems
We utilize the sparsity operation proposed in Section 3.1 for ResNet-50, and generalize Section 4.2 in the main text to the ResNet-50 and ViT-B architectures (Figure 1). We apply the sparsity layer to only a subset of the network, based on the intuition that the brain uses sparsity for long-range communication but can allow dense local computation. We divide the networks into chunks: within each chunk the neurons' activities are allowed to remain dense (kept as in the original), while communication across different chunks is constrained to be sparse.
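The chunked scheme above can be sketched as follows. This is a minimal NumPy illustration, not the paper's implementation: we assume a Top-K style sparsity operation (`topk_sparsify`, a hypothetical name) that keeps only the largest fraction of activations per sample, and we insert it only at the boundary between two dense chunks of a toy network.

```python
import numpy as np

def topk_sparsify(x, keep_ratio=0.25):
    """Zero all but the largest `keep_ratio` fraction of activations
    in each row. A stand-in for the sparsity layer placed between
    dense chunks (assumed Top-K form)."""
    k = max(1, int(round(keep_ratio * x.shape[1])))
    # k-th largest value in each row serves as the threshold
    thresh = np.partition(x, -k, axis=1)[:, -k][:, None]
    return np.where(x >= thresh, x, 0.0)

# Toy "network": two dense chunks with sparse communication between them.
rng = np.random.default_rng(0)
W1 = rng.standard_normal((8, 16))
W2 = rng.standard_normal((16, 4))

def forward(x, keep_ratio=0.25):
    h = np.maximum(x @ W1, 0.0)        # dense computation inside chunk 1
    h = topk_sparsify(h, keep_ratio)   # sparse cross-chunk communication
    return h @ W2                      # chunk 2 consumes the sparse signal

x = rng.standard_normal((2, 8))
h = topk_sparsify(np.maximum(x @ W1, 0.0), 0.25)
print((h != 0).sum(axis=1))  # at most 4 of the 16 activations survive per row
```

In this sketch the sparsity constraint only touches the inter-chunk activations; everything inside a chunk is left dense, mirroring the division described above.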