Generalized Jensen-Shannon Divergence Loss for Learning with Noisy Labels

Neural Information Processing Systems

Based on this observation, we adopt a generalized version of the Jensen-Shannon divergence for multiple distributions to encourage consistency around data points. Using this loss function, we show state-of-the-art results on both synthetic (CIFAR) and real-world (e.g. WebVision) noise with varying noise rates.
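The generalized Jensen-Shannon divergence referred to here is, in its standard form, the entropy of a weighted mixture of distributions minus the weighted average of their entropies. The sketch below illustrates that textbook definition only; the function names and weighting scheme are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

def shannon_entropy(p):
    """Shannon entropy of a discrete distribution (natural log)."""
    p = np.asarray(p, dtype=float)
    # 0 * log(0) is treated as 0 by convention.
    return -np.sum(np.where(p > 0, p * np.log(np.where(p > 0, p, 1.0)), 0.0))

def generalized_jsd(dists, weights=None):
    """Generalized Jensen-Shannon divergence over M distributions:
    JS_pi(p_1, ..., p_M) = H(sum_i pi_i p_i) - sum_i pi_i H(p_i).
    `weights` defaults to uniform pi_i = 1/M (an assumption for this sketch)."""
    dists = np.asarray(dists, dtype=float)
    if weights is None:
        weights = np.full(len(dists), 1.0 / len(dists))
    weights = np.asarray(weights, dtype=float)
    mixture = weights @ dists  # weighted mixture distribution
    return shannon_entropy(mixture) - np.sum(
        [w * shannon_entropy(p) for w, p in zip(weights, dists)]
    )
```

For identical distributions the divergence is zero, and for two disjoint one-hot distributions with uniform weights it reaches its maximum of log 2, which matches the intuition that the loss penalizes inconsistency among predictions around a data point.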