Meta-Query-Net: Resolving Purity-Informativeness Dilemma in Open-set Active Learning (Supplementary Material)

A Complete Proof of Theorem 4.1

Neural Information Processing Systems 

We prove Theorem 4.1 by mathematical induction. For the base step (Lemmas A.1 and A.2), we show that the first layer's output satisfies the skyline constraint, considering each dimension's scalar output separately. For the inductive step (Lemma A.3), we use the composition rule for non-decreasing functions: applying any non-decreasing function does not change the order of its inputs. By mathematical induction, with Lemmas A.1 and A.2 constituting the base step and Lemma A.3 the inductive step, any non-negative-weighted MLP satisfies the skyline constraint.

We train ResNet-18 using SGD with a momentum of 0.9, a weight decay of 0.0005, and a batch size of 64. In the open-set AL setup, the number of IN examples available for training differs depending on the query strategy. We train MQ-Net for 100 epochs using SGD with a weight decay of 0.0005 and a mini-batch size of 64. Figure 5 shows the test accuracy of the target model throughout the AL rounds on the three cross-datasets.
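The inductive argument can be checked numerically: a minimal pure-Python sketch of an MLP whose weights are all non-negative and whose activation (ReLU) is non-decreasing, verifying that increasing either input never decreases the output. The layer sizes and input values here are illustrative, not the architecture used in the paper.

```python
import random

def relu(x):
    # ReLU is non-decreasing, so it preserves the order of its inputs.
    return x if x > 0.0 else 0.0

def mlp_forward(x, layers):
    """Forward pass through an MLP with only non-negative weights.

    `layers` is a list of weight matrices (each a list of rows);
    ReLU is applied after every layer.
    """
    h = x
    for W in layers:
        h = [relu(sum(w_ij * h_j for w_ij, h_j in zip(row, h))) for row in W]
    return h

random.seed(0)
# Hypothetical 2-16-1 network: the two inputs stand in for the
# purity and informativeness scores. random.random() is in [0, 1),
# so every weight is non-negative.
layers = [
    [[random.random() for _ in range(2)] for _ in range(16)],
    [[random.random() for _ in range(16)] for _ in range(1)],
]

# Skyline check: raising either coordinate never lowers the output.
base = mlp_forward([0.3, 0.5], layers)[0]
more_pure = mlp_forward([0.6, 0.5], layers)[0]
more_info = mlp_forward([0.3, 0.9], layers)[0]
assert more_pure >= base and more_info >= base
```

Because every weight is non-negative and ReLU is non-decreasing, each layer is coordinate-wise non-decreasing in its inputs, and the composition of non-decreasing maps stays non-decreasing, which is exactly the inductive step above.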