Bayesian Attention Modules: Appendix A (Algorithm 1: Bayesian Attention Modules)

Neural Information Processing Systems 

We follow the same architectural hyperparameters as in Veličković et al. We adopt hypothesis testing to quantify the uncertainty of a model's predictions. The per-answer accuracy is Acc(ans) = min{(# humans that said ans)/3, 1}. By stacking MCA layers, MCAN enables deep interactions between the question and image features. We also conduct experiments on an attention-based image-captioning model, Att2in, from Rennie et al.
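The accuracy metric above can be sketched as a short function. This is a minimal illustration, not the authors' code; the function name and the list-of-strings input format are assumptions.

```python
def answer_accuracy(ans, human_answers):
    """Accuracy metric from the text: min((# humans that said ans) / 3, 1).

    `ans` is the model's predicted answer string; `human_answers` is a list
    of answer strings collected from human annotators (names are illustrative).
    """
    count = sum(1 for a in human_answers if a == ans)
    return min(count / 3, 1.0)
```

For example, an answer given by three or more annotators scores full accuracy, while one given by a single annotator scores 1/3.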
