A Federated Many-to-One Hopfield Model for Associative Neural Networks

Alessandrelli, Andrea, Durante, Fabrizio, Ladiana, Andrea, Lepre, Andrea

arXiv.org Machine Learning

Federated learning enables collaborative training without sharing raw data, but struggles under client heterogeneity and streaming distribution shifts, where drift and novel data can impair convergence and cause forgetting. We propose a federated associative-memory framework that learns shared archetypes in heterogeneous, continual settings, where client data are independent but not necessarily balanced. Each client encodes its experience as a low-rank Hebbian operator, sent to a central server for aggregation and factorization into global archetypes. This approach preserves privacy, avoids centralized replay buffers, and is robust to small, noisy, or evolving datasets. We cast aggregation as a low-rank-plus-noise spectral inference problem, deriving theoretical thresholds for detectability and retrieval robustness. An entropy-based controller balances stability and plasticity in streaming regimes. Experiments with heterogeneous clients, drift, and novelty show improved global archetype reconstruction and associative retrieval, supporting the spectral view of federated consolidation.



SAFE Trained Models

Neural Information Processing Systems

After calibrating in the first session, the slow efficient-tuning parameters can capture more informative features, improving generalization to incoming classes. Moreover, to further incorporate novel concepts, we strike a balance between stability and plasticity by fixing the slow efficient-tuning parameters and continuously updating the fast ones. Specifically, a cross-classification loss with feature alignment is proposed to circumvent catastrophic forgetting.
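The stability/plasticity split above can be sketched as follows, under assumed details: a slow branch is calibrated in the first session and then frozen, while a fast branch keeps updating on new data but is pulled back toward the slow branch. The class name, the toy reconstruction objective, and the `align_weight` parameter are hypothetical; the paper's cross-classification loss with feature alignment is only approximated here by an L2 alignment term.

```python
import numpy as np

class SlowFastTuner:
    def __init__(self, dim, rng):
        self.slow = 0.1 * rng.normal(size=(dim, dim))  # calibrated, then frozen
        self.fast = self.slow.copy()                   # continually updated
        self.slow_frozen = False

    def end_first_session(self):
        # Stability: stop updating the slow branch after the first session.
        self.slow_frozen = True

    def update(self, x, lr=0.01, align_weight=0.1):
        # Plasticity: fit the fast branch to new data (toy reconstruction
        # objective ||W x - x||^2), regularized toward the frozen slow branch.
        err = self.fast @ x - x
        grad = np.outer(err, x) + align_weight * (self.fast - self.slow)
        self.fast -= lr * grad
        if not self.slow_frozen:
            self.slow -= lr * np.outer(self.slow @ x - x, x)
```

After `end_first_session()` is called, subsequent `update()` calls change only the fast parameters, so the frozen slow branch anchors earlier knowledge while the fast branch absorbs novel concepts.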