A Implementation Details
A.1 CIFAR-10 ResNet-29

For all experimental results on ResNet-29 v2 (He et al., 2016b), we use a batch size of 256. The network is trained with the Adam optimizer (Kingma & Ba, 2015) for 200 epochs. We adapted the data augmentation and training script from https://keras.io/examples/cifar10_resnet/. The training setup is identical for all methods compared in the main paper. We randomly split the original training set into 45,000 training images and 5,000 validation images.
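The following is a minimal sketch of this training setup in TensorFlow/Keras, not the authors' exact script. It assumes a hypothetical model builder `build_resnet29_v2` standing in for the ResNet-29 v2 construction code adapted from the Keras CIFAR-10 ResNet example; the augmentation settings (small shifts and horizontal flips) follow that example, and the Adam learning rate is left at the library default.

```python
import numpy as np
import tensorflow as tf
from tensorflow import keras

# Load CIFAR-10 and normalize pixel values to [0, 1].
(x_train, y_train), (x_test, y_test) = keras.datasets.cifar10.load_data()
x_train = x_train.astype("float32") / 255.0
y_train = keras.utils.to_categorical(y_train, 10)

# Random 45,000 / 5,000 train/validation split of the 50,000 training images.
rng = np.random.default_rng(0)
idx = rng.permutation(len(x_train))
train_idx, val_idx = idx[:45000], idx[45000:]
x_tr, y_tr = x_train[train_idx], y_train[train_idx]
x_val, y_val = x_train[val_idx], y_train[val_idx]

# Data augmentation as in the Keras CIFAR-10 ResNet example:
# random shifts and horizontal flips.
datagen = keras.preprocessing.image.ImageDataGenerator(
    width_shift_range=0.1,
    height_shift_range=0.1,
    horizontal_flip=True,
)
datagen.fit(x_tr)

# Hypothetical builder for ResNet-29 v2 (not defined here); stands in for the
# model construction code adapted from the Keras example script.
model = build_resnet29_v2(input_shape=(32, 32, 3), num_classes=10)
model.compile(
    optimizer=keras.optimizers.Adam(),  # default Adam settings assumed
    loss="categorical_crossentropy",
    metrics=["accuracy"],
)

# Batch size 256, 200 epochs, validated on the held-out 5,000 images.
model.fit(
    datagen.flow(x_tr, y_tr, batch_size=256),
    epochs=200,
    validation_data=(x_val, y_val),
)
```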