Decomposed Knowledge Distillation for Class-Incremental Semantic Segmentation
Neural Information Processing Systems
To this end, it is crucial to learn novel classes incrementally without forgetting previously learned knowledge. Current CISS methods typically preserve classifier logits with a knowledge distillation (KD) technique, or freeze the feature extractor, to avoid the forgetting problem. These strong constraints, however, prevent the model from learning discriminative features for novel classes. We introduce a CISS framework that alleviates the forgetting problem and facilitates learning novel classes effectively. We have found that a logit can be decomposed into two terms.
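To illustrate the logit-preservation constraint the abstract refers to, the following is a minimal sketch (not the paper's decomposed method) of a plain distillation loss that penalizes the new model for drifting from the old model's logits on previously learned classes. The function name and the use of a mean-squared penalty are illustrative assumptions, not details from the paper.

```python
def kd_loss(old_logits, new_logits):
    """Mean squared error between the old model's logits and the new
    model's logits on the previously learned classes.

    Illustrative sketch only: many CISS methods use a penalty of this
    general form to keep old-class predictions stable while the model
    is trained on novel classes.
    """
    assert len(old_logits) == len(new_logits)
    n = len(old_logits)
    return sum((o - z) ** 2 for o, z in zip(old_logits, new_logits)) / n

# The new model is pushed to reproduce the old model's logits for old
# classes, while its additional logits for novel classes are trained
# with an ordinary segmentation loss.
loss = kd_loss([2.0, -1.0, 0.5], [1.5, -0.5, 0.5])
```

Because this term is minimized only when the new logits match the old ones exactly, it acts as the "strong constraint" the abstract describes: stability for old classes comes at the cost of plasticity for new ones.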