Generalized Jensen-Shannon Divergence Loss for Learning with Noisy Labels
– Neural Information Processing Systems
Based on this observation, we adopt a generalized version of the Jensen-Shannon divergence for multiple distributions to encourage consistency around data points. Using this loss function, we show state-of-the-art results on both synthetic (CIFAR) and real-world (e.g. WebVision) noise with varying noise rates.
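The generalized Jensen-Shannon divergence for multiple discrete distributions can be written as the entropy of a weighted mixture minus the weighted entropies of the components. The sketch below illustrates that quantity for plain probability vectors; it is a generic illustration only, not the authors' loss implementation (which operates on network predictions and label distributions), and the function names are assumptions for this example.

```python
import numpy as np

def entropy(p):
    # Shannon entropy (natural log), skipping zero-probability entries.
    p = np.asarray(p, dtype=float)
    nz = p > 0
    return -np.sum(p[nz] * np.log(p[nz]))

def generalized_js(distributions, weights=None):
    """Generalized Jensen-Shannon divergence among several discrete
    distributions: H(sum_i w_i p_i) - sum_i w_i H(p_i).
    Defaults to uniform weights; zero iff all distributions coincide."""
    ps = np.asarray(distributions, dtype=float)
    n = ps.shape[0]
    w = np.full(n, 1.0 / n) if weights is None else np.asarray(weights, dtype=float)
    mixture = w @ ps                       # weighted average distribution
    component_entropies = np.array([entropy(p) for p in ps])
    return entropy(mixture) - np.sum(w * component_entropies)
```

For identical distributions the divergence is zero; for disjoint supports with uniform weights it attains its maximum, the entropy of the weight vector.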