FedGTST: Boosting Global Transferability of Federated Models via Statistics Tuning

Neural Information Processing Systems

The performance of Transfer Learning (TL) depends significantly on effective pretraining, which requires not only extensive amounts of data but also substantial computational resources. As a result, in practice, it is challenging to perform TL successfully at the level of individual model developers. Despite several attempts to devise effective transferable Federated Learning (FL) approaches, important issues remain unsolved. First, existing methods in this setting primarily focus on optimizing transferability within their local client domains, thereby ignoring transferability over the global learning domain. Second, most approaches focus on analyzing indirect transferability metrics, which do not allow for an accurate assessment of the final target loss or the extent of transferability.