

6 Supplementary Material

Neural Information Processing Systems

The original CLUTRR data generation framework ensured that no test proof appears in the training set, in order to test whether a model is able to generalize to unseen proofs. The models are given "[story] [query]" as input and asked to generate the proof and the answer; models are trained on levels 2, 4, and 6 only. Initial results on the original CLUTRR test sets showed strong model performance (about 99%) on the levels seen during training (2, 4, 6) but no generalization at all (about 0%) to other levels. In our case, the entity names are important for evaluating systematic generalization.
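The proof-disjointness constraint described above can be sketched as a simple split routine. This is a minimal illustration, not the actual CLUTRR generation code; the record layout and the field names `"level"` and `"proof"` are assumptions made for the example.

```python
# Sketch: keep only test examples whose proof string never occurs in
# the training split. The dict-based record format is hypothetical.

def make_splits(examples, train_levels={2, 4, 6}):
    """Split examples by level, dropping test proofs seen in training."""
    train = [ex for ex in examples if ex["level"] in train_levels]
    seen_proofs = {ex["proof"] for ex in train}
    test = [ex for ex in examples
            if ex["level"] not in train_levels
            and ex["proof"] not in seen_proofs]
    return train, test

examples = [
    {"level": 2, "proof": "A-B-C"},
    {"level": 4, "proof": "A-B-C-D-E"},
    {"level": 3, "proof": "A-B-C"},    # proof already in training: excluded
    {"level": 3, "proof": "X-Y-Z-W"},  # unseen proof: kept for testing
]
train, test = make_splits(examples)
```

Filtering by the full proof (rather than by level alone) is what makes the test a check of generalization to unseen reasoning chains, not just unseen lengths.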








The original MuZero did not use sticky actions (Machado et al., 2017), i.e. a 25% chance that the selected action is ignored and the previous action is repeated instead, for its Atari experiments. For all experiments in this work we used a network architecture based on the one introduced by MuZero (Schrittwieser et al., 2020). To implement the network, we used the modules provided by the Haiku neural network library (Hennigan et al., 2020). We did not observe any benefit from using a Gaussian mixture, so in all our experiments we used a single Gaussian with diagonal covariance. All experiments used the Adam optimiser (Kingma & Ba, 2015) with decoupled weight decay (Loshchilov & Hutter, 2017) for training.
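The sticky-actions mechanism mentioned above (Machado et al., 2017) can be sketched as a thin environment wrapper. This is an illustrative sketch only; the `step(action)` environment interface and the `StickyActions` class name are assumptions, not part of the MuZero or ALE codebases.

```python
import random

class StickyActions:
    """Sticky-action wrapper sketch: with probability `stickiness`,
    the selected action is ignored and the previous action is
    repeated instead (25% in Machado et al., 2017)."""

    def __init__(self, env, stickiness=0.25, seed=None):
        self.env = env                  # assumed to expose step(action)
        self.stickiness = stickiness
        self.rng = random.Random(seed)
        self.prev_action = None

    def step(self, action):
        # Repeat the previous action with probability `stickiness`.
        if self.prev_action is not None and self.rng.random() < self.stickiness:
            action = self.prev_action
        self.prev_action = action
        return self.env.step(action)

# Usage with a dummy environment that just records actions taken.
class DummyEnv:
    def __init__(self):
        self.taken = []
    def step(self, action):
        self.taken.append(action)
        return action

sticky_env = DummyEnv()
wrapped = StickyActions(sticky_env, stickiness=1.0)
for a in [1, 2, 3]:
    wrapped.step(a)          # every step repeats the first action

plain_env = DummyEnv()
passthrough = StickyActions(plain_env, stickiness=0.0)
for a in [1, 2, 3]:
    passthrough.step(a)      # actions pass through unchanged
```

Randomizing action execution this way injects stochasticity into otherwise deterministic Atari dynamics, which is why it is commonly used to discourage memorized open-loop policies.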




Supplementary Materials for House of Cans: Covert Transmission of Internal Datasets via Capacity-Aware Neuron Steganography


However, given the ever-evolving paradigms in deep learning, employees with ulterior motives may fabricate justifications, such as the requirements of data augmentation [6] or the purposes of multimodal learning [3], to apply for access to both relevant and irrelevant private datasets, a tactic that is common in social engineering [4].