Keunseo Kim

Neural Information Processing Systems 

Supplementary Material for the Paper entitled "Locally Most Powerful Bayesian Test for Out-of-Distribution Detection Using Deep Generative Models"

We present the implementation details of the VAE and Glow models used in Section 5. First, the structure of Glow is given in Table 1, where "Level" denotes the number of scales that split the dimension of the latent space in the multi-scale architecture (Dinh et al., 2016); "Depth per level" denotes the number of flow layers repeated at each scale; "In-channels / hidden units" denotes the number of input channels and the number of hidden units of the neural network that parameterizes the flow layers; and "Coupling" denotes the type of coupling layer. For training, we used the Adam optimizer with a learning rate of 0.001 and a batch size of 64 for both datasets.

Next, Table 2 presents the structure of the VAE, which is the same as that used in Xiao et al. (2020). A batch normalization layer (BN) and a ReLU activation layer were added after each convolutional layer. For training the VAE, we used the Adam optimizer with a learning rate of 0.0005, a weight decay of 0.00003, and a batch size of 64 for both datasets.
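As a minimal sketch of the coupling layers referenced in Table 1, the following illustrates a generic affine coupling transform of the kind used in Glow-style flows (Dinh et al., 2016): one half of the channels conditions a scale-and-shift applied to the other half, which makes the transform exactly invertible. The `scale_net` and `shift_net` callables here are hypothetical stand-ins for the conditioning networks; the actual paper uses the convolutional networks specified in Table 1, not these toy functions.

```python
import numpy as np

def affine_coupling_forward(x, scale_net, shift_net):
    """Forward pass of an affine coupling layer.

    Splits the channel dimension in half; x1 passes through unchanged
    and conditions the affine transform applied to x2.
    """
    x1, x2 = np.split(x, 2, axis=-1)
    s = scale_net(x1)                  # log-scale, conditioned on x1
    t = shift_net(x1)                  # shift, conditioned on x1
    y2 = x2 * np.exp(s) + t
    return np.concatenate([x1, y2], axis=-1)

def affine_coupling_inverse(y, scale_net, shift_net):
    """Exact inverse: y1 is untouched, so s and t can be recomputed."""
    y1, y2 = np.split(y, 2, axis=-1)
    s = scale_net(y1)
    t = shift_net(y1)
    x2 = (y2 - t) * np.exp(-s)
    return np.concatenate([y1, x2], axis=-1)
```

Because the conditioning half is left unchanged, the inverse recomputes the same scale and shift, so the round trip recovers the input exactly; this invertibility is what allows Glow to evaluate exact log-likelihoods.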
