DropBlock: A regularization method for convolutional networks
Golnaz Ghiasi, Tsung-Yi Lin, Quoc V. Le
Neural Information Processing Systems
Deep neural networks often work well when they are over-parameterized and trained with a massive amount of noise and regularization, such as weight decay and dropout. Although dropout is widely used as a regularization technique for fully connected layers, it is often less effective for convolutional layers. This lack of success of dropout for convolutional layers is perhaps due to the fact that activation units in convolutional layers are spatially correlated so information can still flow through convolutional networks despite dropout. Thus a structured form of dropout is needed to regularize convolutional networks. In this paper, we introduce DropBlock, a form of structured dropout, where units in a contiguous region of a feature map are dropped together.
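The abstract's core idea, dropping contiguous regions of a feature map rather than independent units, can be sketched in a few lines of NumPy. This is an illustrative approximation, not the paper's reference implementation: block centers are sampled at a rate `gamma = drop_prob / block_size**2` (a simplified version of the paper's gamma, ignoring border corrections), a square block is zeroed around each center, and the surviving activations are rescaled to preserve the expected magnitude.

```python
import numpy as np

def dropblock(x, block_size=3, drop_prob=0.1, rng=None):
    """Minimal DropBlock sketch for a single 2-D feature map x of shape (H, W).

    Assumptions (not from the paper): a simplified gamma without the
    border-correction factor, and a fixed rescaling by count of kept units.
    """
    if rng is None:
        rng = np.random.default_rng(0)
    H, W = x.shape
    # Rate for sampling block *centers*: each center removes ~block_size**2
    # units, so divide the target drop probability by the block area.
    gamma = drop_prob / (block_size ** 2)
    centers = rng.random((H, W)) < gamma
    mask = np.ones((H, W))
    half = block_size // 2
    # Zero a contiguous block_size x block_size region around each center,
    # clipped at the feature-map borders.
    for i, j in zip(*np.nonzero(centers)):
        i0, i1 = max(i - half, 0), min(i + half + 1, H)
        j0, j1 = max(j - half, 0), min(j + half + 1, W)
        mask[i0:i1, j0:j1] = 0.0
    kept = mask.sum()
    if kept == 0:
        return np.zeros_like(x)
    # Rescale surviving activations so the expected sum is preserved.
    return x * mask * (mask.size / kept)
```

Because whole blocks are removed, nearby spatially correlated activations cannot substitute for a dropped unit, which is the motivation the abstract gives for preferring structured dropout in convolutional layers.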