Bayesian Federated Neural Matching that Completes Full Information
Federated learning is a contemporary machine learning paradigm in which locally trained models are distilled into a global model. To handle the intrinsic permutation invariance of neural networks, Probabilistic Federated Neural Matching (PFNM) places a Bayesian nonparametric model over the generation of local neurons and then solves a linear sum assignment problem in each alternating optimization iteration. However, our theoretical analysis shows that the optimization iteration in PFNM neglects the global information that is already available. In this study, we propose a novel approach that overcomes this flaw by introducing a Kullback-Leibler divergence penalty at each iteration. The effectiveness of our approach is demonstrated by experiments on both image classification and semantic segmentation tasks.
arXiv.org Artificial Intelligence
Feb-21-2023
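
To make the matching step concrete, here is a minimal, hypothetical Python sketch of one such iteration: local neurons are assigned to global neurons by solving a linear sum assignment over a cost that combines a data-fit term with a KL-divergence penalty toward the current global estimate. The squared-distance cost, the diagonal-Gaussian KL, the unit variances, and the `kl_weight` knob are illustrative assumptions, not the paper's actual formulation; `scipy.optimize.linear_sum_assignment` performs the matching.

```python
# Hypothetical sketch of one PFNM-style matching iteration with a KL penalty.
# The cost model (squared distance + diagonal-Gaussian KL) is an assumption
# made for illustration, not the authors' derivation.
import numpy as np
from scipy.optimize import linear_sum_assignment


def gaussian_kl(mu_p, var_p, mu_q, var_q):
    """KL divergence between diagonal Gaussians N(mu_p, var_p) and N(mu_q, var_q)."""
    return 0.5 * np.sum(
        np.log(var_q / var_p) + (var_p + (mu_p - mu_q) ** 2) / var_q - 1.0
    )


def match_local_to_global(local_w, global_mu, global_var, kl_weight=1.0):
    """Assign local neurons (rows of local_w) to global neurons by solving a
    linear sum assignment whose cost adds a KL penalty toward the current
    global estimate. Returns (local_indices, global_indices)."""
    J, K = local_w.shape[0], global_mu.shape[0]
    cost = np.zeros((J, K))
    for j in range(J):
        for k in range(K):
            # Data-fit term: squared distance of local neuron j to global neuron k.
            fit = np.sum((local_w[j] - global_mu[k]) ** 2)
            # Penalty term: KL between a unit-variance Gaussian centred at the
            # local neuron and the global neuron's current Gaussian estimate.
            kl = gaussian_kl(local_w[j], np.ones_like(local_w[j]),
                             global_mu[k], global_var[k])
            cost[j, k] = fit + kl_weight * kl
    return linear_sum_assignment(cost)


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    local_w = rng.normal(size=(5, 8))     # 5 local neurons, 8-dim weight vectors
    global_mu = rng.normal(size=(6, 8))   # 6 global neurons
    global_var = np.ones((6, 8))          # toy choice: unit global variance
    rows, cols = match_local_to_global(local_w, global_mu, global_var)
    print(list(zip(rows.tolist(), cols.tolist())))
```

In an alternating scheme of this kind, the global neuron estimates would be re-updated from the matched local neurons after each assignment; the snippet shows only the assignment step that the KL penalty modifies.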