Decomposed Knowledge Distillation for Class-Incremental Semantic Segmentation
Neural Information Processing Systems
To this end, it is crucial to learn novel classes incrementally without forgetting previously learned knowledge. To avoid forgetting, current CISS methods typically either apply a knowledge distillation (KD) technique to preserve classifier logits or freeze the feature extractor. These strong constraints, however, prevent the model from learning discriminative features for novel classes. We introduce a CISS framework that alleviates the forgetting problem and facilitates learning novel classes effectively. We have found that a logit can be decomposed into two terms.
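The abstract mentions distilling classifier logits to preserve old knowledge; the paper's specific decomposition is not spelled out in this excerpt. As a reference point only, here is a minimal numpy sketch of conventional logit-matching KD (temperature-scaled KL divergence between teacher and student logits, per pixel/sample); the function names and temperature value are illustrative assumptions, not the paper's implementation:

```python
import numpy as np

def softmax(logits, T=1.0):
    # Temperature-scaled, numerically stable softmax over the class axis.
    z = np.asarray(logits, dtype=float) / T
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def kd_loss(student_logits, teacher_logits, T=2.0):
    # Standard logit-matching KD: KL(teacher || student) at temperature T,
    # scaled by T^2 as is conventional, averaged over samples.
    p_t = softmax(teacher_logits, T)
    log_p_s = np.log(softmax(student_logits, T))
    kl = np.sum(p_t * (np.log(p_t) - log_p_s), axis=-1)
    return float((T * T) * kl.mean())
```

The loss is zero when the student reproduces the teacher's logits exactly and grows as their class distributions diverge; in CISS this term is typically added, with a weight, to the segmentation loss for the new classes.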
class-incremental semantic segmentation, decomposed knowledge distillation, name change