FedGTEA: Federated Class-Incremental Learning with Gaussian Task Embedding and Alignment
We introduce a novel framework for Federated Class-Incremental Learning, called Federated Gaussian Task Embedding and Alignment (FedGTEA). FedGTEA is designed to capture task-specific knowledge and model uncertainty in a scalable and communication-efficient manner. On the client side, the Cardinality-Agnostic Task Encoder (CATE) produces Gaussian-distributed task embeddings that encode task knowledge, address statistical heterogeneity, and quantify data uncertainty. Importantly, CATE maintains a fixed parameter size regardless of the number of tasks, which ensures scalability across long task sequences. On the server side, FedGTEA utilizes the 2-Wasserstein distance to measure inter-task gaps between Gaussian embeddings, and we formulate a Wasserstein loss to enforce inter-task separation. This probabilistic formulation not only enhances representation learning but also preserves task-level privacy by avoiding the direct transmission of latent embeddings, aligning with the privacy constraints of federated learning. Extensive empirical evaluations on popular datasets demonstrate that FedGTEA achieves superior classification performance and significantly mitigates forgetting, consistently outperforming strong existing baselines.
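The 2-Wasserstein distance between two Gaussians has a closed form, which is what makes it attractive as a separation measure here. The sketch below illustrates the idea under the simplifying assumption of diagonal covariances (the abstract does not specify the covariance structure, and the function names, the hinge-style loss, and the margin parameter are illustrative, not taken from the paper):

```python
import math

def w2_diag(mu1, sigma1, mu2, sigma2):
    # Closed-form 2-Wasserstein distance between diagonal Gaussians
    # N(mu1, diag(sigma1^2)) and N(mu2, diag(sigma2^2)):
    #   W2^2 = ||mu1 - mu2||^2 + ||sigma1 - sigma2||^2
    sq = sum((a - b) ** 2 for a, b in zip(mu1, mu2))
    sq += sum((a - b) ** 2 for a, b in zip(sigma1, sigma2))
    return math.sqrt(sq)

def separation_loss(w2, margin=1.0):
    # Hypothetical hinge-style penalty encouraging task embeddings to
    # stay at least `margin` apart in Wasserstein space.
    return max(0.0, margin - w2)
```

For example, `w2_diag([0, 0], [1, 1], [3, 4], [1, 1])` evaluates to 5.0, and a pair of tasks already separated by more than the margin incurs zero loss.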
Oct-16-2025