C2G-KD: PCA-Constrained Generator for Data-Free Knowledge Distillation
Magnus Bengtsson, Kenneth Östberg
arXiv.org Artificial Intelligence
Training deep neural networks typically requires large datasets, which may not be available in privacy-sensitive or resource-constrained domains. Data-free knowledge distillation (DFKD) [1] has emerged as a promising approach in which a student model learns from synthetic data generated by querying a pretrained teacher network. However, existing DFKD methods often fail to ensure structural alignment between synthetic and real data. We propose C2G-KD, a method that leverages PCA-derived constraints to guide a conditional generator in producing class-specific synthetic samples without direct access to real data. Central to this approach is the use of PCA [2] to impose topological constraints derived from a minimal set of real samples, ensuring that generated images structurally align with class manifolds.
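The abstract does not spell out how the PCA-derived constraint is computed, but one plausible reading is a reconstruction-residual penalty: fit a per-class PCA subspace from a handful of real samples, then penalize generated samples by the energy that falls outside that subspace. The sketch below illustrates this idea with NumPy; the function names (`fit_class_pca`, `pca_constraint_loss`) and the residual penalty itself are assumptions for illustration, not the paper's exact formulation.

```python
import numpy as np

def fit_class_pca(samples, n_components):
    # samples: (N, D) flattened images of one class.
    # SVD of the centered data yields the principal directions.
    mean = samples.mean(axis=0)
    _, _, vt = np.linalg.svd(samples - mean, full_matrices=False)
    return mean, vt[:n_components]  # mean: (D,), components: (k, D)

def pca_constraint_loss(generated, mean, components):
    # Project generated samples onto the class PCA subspace and
    # penalize the residual lying outside it (hypothetical constraint).
    centered = generated - mean
    coeffs = centered @ components.T           # (B, k) subspace coordinates
    reconstructed = coeffs @ components        # (B, D) projection back
    residual = centered - reconstructed
    return float(np.mean(np.sum(residual ** 2, axis=1)))

# Toy check: points drawn from a rank-3 "class manifold" incur near-zero
# loss, while points perturbed off the manifold are penalized.
rng = np.random.default_rng(0)
basis = rng.normal(size=(3, 16))                    # 3-dim subspace in R^16
real = rng.normal(size=(32, 3)) @ basis             # minimal real samples
mean, comps = fit_class_pca(real, n_components=3)
on_manifold = rng.normal(size=(8, 3)) @ basis
off_manifold = on_manifold + rng.normal(size=(8, 16))
loss_on = pca_constraint_loss(on_manifold, mean, comps)
loss_off = pca_constraint_loss(off_manifold, mean, comps)
```

In a full DFKD pipeline such a term would be added to the generator's objective alongside the usual teacher-guided losses, steering synthetic samples toward the class manifold estimated from the few available real samples.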
Jul-25-2025