DropBlock: A regularization method for convolutional networks

Golnaz Ghiasi, Tsung-Yi Lin, Quoc V. Le

Neural Information Processing Systems

Deep neural networks often work well when they are over-parameterized and trained with a massive amount of noise and regularization, such as weight decay and dropout. Although dropout is widely used as a regularization technique for fully connected layers, it is often less effective for convolutional layers.
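The abstract contrasts standard dropout with DropBlock, which drops contiguous square regions of a feature map rather than independent units. Below is a minimal, hedged sketch of that idea for a single 2-D feature map; the function name and argument defaults are illustrative, not the authors' implementation.

```python
import numpy as np

def dropblock(x, block_size=3, drop_prob=0.1, rng=None):
    """Illustrative DropBlock-style masking for one 2-D feature map (H, W).

    Instead of zeroing independent units (dropout), zero out contiguous
    block_size x block_size regions, then rescale to keep the expected
    activation magnitude, as in inverted dropout.
    """
    rng = np.random.default_rng(rng)
    h, w = x.shape
    # Seed probability chosen so the expected dropped fraction is
    # roughly drop_prob (an approximation, assumed here).
    valid = (h - block_size + 1) * (w - block_size + 1)
    gamma = drop_prob * (h * w) / (block_size ** 2) / valid
    seeds = rng.random((h - block_size + 1, w - block_size + 1)) < gamma
    mask = np.ones_like(x)
    for i, j in zip(*np.nonzero(seeds)):
        mask[i:i + block_size, j:j + block_size] = 0.0
    kept = mask.sum()
    if kept == 0:
        return x  # degenerate case: every unit dropped; pass through
    return x * mask * (mask.size / kept)  # rescale surviving activations
```

With `drop_prob=0` the map passes through unchanged; at training time the mask is resampled each forward pass, and at inference time the layer would be skipped entirely.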





Neural Information Processing Systems

The priors used in this presentation include variants of total variation, Laplacian regularization, bilateral filtering, sparse coding on learned dictionaries, and non-local self-similarities. Our models are fully interpretable as well as parameter- and data-efficient.
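Of the priors listed above, total variation is the simplest to state: it penalizes the sum of absolute differences between neighboring pixels, favoring piecewise-smooth images. A minimal sketch of the anisotropic form (the function name is illustrative, not from the paper):

```python
import numpy as np

def total_variation(img):
    """Anisotropic total variation of a 2-D image: the sum of absolute
    differences between vertically and horizontally adjacent pixels.
    Commonly used as a smoothness prior in image restoration."""
    dv = np.abs(np.diff(img, axis=0)).sum()  # vertical neighbor differences
    dh = np.abs(np.diff(img, axis=1)).sum()  # horizontal neighbor differences
    return dv + dh
```

A constant image has zero total variation; edges and noise both raise it, so minimizing it trades detail for smoothness.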


Supplementary Materials

Neural Information Processing Systems

These parameters are inherited from SA layers. During inference, the points of the entire scene are taken as the input. The result shows that the optimal choice of G grows in a linear relationship with C. In this experiment, we choose C = 3, G = 12 as the default setting. On the SUN-RGBD benchmark, we found that the performance of the FP2 layer is less satisfying than that of the FP1 layer.



REVIVE: Regional Visual Representation Matters in Knowledge-Based Visual Question Answering

Neural Information Processing Systems

This paper revisits visual representation in knowledge-based visual question answering (VQA) and demonstrates that using regional information in a better way can significantly improve the performance. While visual representation is extensively studied in traditional VQA, it is under-explored in knowledge-based VQA even though these two tasks share the common spirit, i.e., relying on visual input to answer the question.