Reviews: Understanding the Effective Receptive Field in Deep Convolutional Neural Networks

Neural Information Processing Systems 

Batch normalization, which is common in typical convolutional networks these days, is also not discussed; it might help justify the extra assumptions needed to show that the main results hold with ReLU nonlinearities. In general, I thought the experiments could have been carried out in a more quantitative way (a sketch of one possible measurement follows below). It is somewhat interesting that the effective receptive field grows during training, but then again, perhaps it must grow for the network to improve, and there is no reason why it cannot grow at least somewhat.
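
For concreteness, the growth of the effective receptive field could be quantified with the paper's gradient-based definition: place a unit gradient on the center of the output map, backpropagate, and average the input-gradient magnitude over random inputs. The following is a rough sketch of my own, assuming PyTorch; the toy model, input size, and sample count are placeholders, not the authors' setup.

```python
import torch
import torch.nn as nn

def effective_receptive_field(model, in_size=64, n_samples=32):
    """Average |d(center output) / d(input)| over random inputs."""
    erf = torch.zeros(in_size, in_size)
    for _ in range(n_samples):
        x = torch.randn(1, 1, in_size, in_size, requires_grad=True)
        y = model(x)
        # Unit gradient at the spatial center of the output map, zero elsewhere.
        grad = torch.zeros_like(y)
        grad[0, 0, y.shape[2] // 2, y.shape[3] // 2] = 1.0
        y.backward(grad)
        erf += x.grad.abs()[0].sum(dim=0)
    return erf / n_samples

# Hypothetical toy model: five 3x3 conv layers with ReLU nonlinearities.
layers = []
for _ in range(5):
    layers += [nn.Conv2d(1, 1, kernel_size=3, padding=1), nn.ReLU()]
model = nn.Sequential(*layers)

erf = effective_receptive_field(model)
```

Running this at successive training checkpoints and reporting, say, the side length of the smallest centered square that captures 95% of the gradient mass would turn the qualitative "the receptive field grows" observation into a number that can be compared across architectures and training stages.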