Easy Batch Normalization
Arip Asadulaev, Alexander Panfilov, Andrey Filchenkov
arXiv.org Artificial Intelligence
It has been shown that adversarial examples can improve object recognition. But what about their opposite, easy examples? Easy examples are samples that a machine learning model classifies correctly with high confidence. In our paper, we take a first step toward exploring the potential benefits of using easy examples in the training procedure of neural networks. We propose using an auxiliary batch normalization for easy examples to improve both standard and robust accuracy.
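The core mechanism described in the abstract is a batch-normalization layer with a second, auxiliary set of statistics reserved for easy examples. The following is a minimal NumPy sketch of that idea, not the authors' implementation: the class name, the `branch` argument, and the two-branch routing are illustrative assumptions; the two branches keep independent running statistics while sharing the affine parameters.

```python
import numpy as np

class AuxiliaryBatchNorm:
    """Batch normalization with a separate (auxiliary) statistics branch.

    Illustrative sketch (not the paper's code): the 'main' statistics are
    updated on ordinary mini-batches, while the 'aux' statistics are
    updated only on batches of easy examples, so the two input
    distributions are normalized independently but share gamma and beta.
    """

    def __init__(self, num_features, momentum=0.1, eps=1e-5):
        self.gamma = np.ones(num_features)   # shared scale parameter
        self.beta = np.zeros(num_features)   # shared shift parameter
        self.momentum = momentum
        self.eps = eps
        # one running-statistics pair per branch
        self.running = {
            "main": {"mean": np.zeros(num_features), "var": np.ones(num_features)},
            "aux":  {"mean": np.zeros(num_features), "var": np.ones(num_features)},
        }

    def forward(self, x, branch="main", training=True):
        stats = self.running[branch]
        if training:
            mean = x.mean(axis=0)
            var = x.var(axis=0)
            # exponential moving average, updated only for the chosen branch
            stats["mean"] = (1 - self.momentum) * stats["mean"] + self.momentum * mean
            stats["var"] = (1 - self.momentum) * stats["var"] + self.momentum * var
        else:
            mean, var = stats["mean"], stats["var"]
        x_hat = (x - mean) / np.sqrt(var + self.eps)
        return self.gamma * x_hat + self.beta
```

During training, ordinary batches would be routed with `branch="main"` and batches of easy examples with `branch="aux"`; at test time only the main branch is typically used, so the auxiliary statistics never contaminate inference.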
Jul-18-2022