Energy-Efficient and Federated Meta-Learning via Projected Stochastic Gradient Ascent
Elgabli, Anis, Issaid, Chaouki Ben, Bedi, Amrit S., Bennis, Mehdi, Aggarwal, Vaneet
In this paper, we propose an energy-efficient federated meta-learning framework. The objective is to learn a meta-model that can be fine-tuned to a new task with only a few samples in a distributed setting and at low computation and communication energy cost. We assume that each task is owned by a separate agent, so only a limited number of tasks is available to train the meta-model. Assuming each task has been trained offline on the agent's local data, we propose a lightweight algorithm that starts from the local models of all agents and, proceeding in a backward manner via projected stochastic gradient ascent (P-SGA), finds a meta-model. The proposed method avoids complex computations such as Hessian evaluation, double loops, and matrix inversion, while achieving high performance at significantly lower energy consumption than state-of-the-art methods such as MAML and iMAML in experiments conducted on sinusoid regression and image classification tasks.
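The abstract does not spell out the paper's exact update rule, but the primitive it builds on, projected stochastic gradient ascent, is simple to illustrate. Below is a minimal NumPy sketch of a generic P-SGA loop: take a noisy gradient ascent step, then project back onto a feasible set after each step. The ball-shaped feasible set, the toy concave objective, the noise model, and all hyperparameters are assumptions for illustration only, not the authors' algorithm.

```python
import numpy as np

def project_onto_ball(x, center, radius):
    """Euclidean projection onto the ball {y : ||y - center|| <= radius}."""
    diff = x - center
    norm = np.linalg.norm(diff)
    if norm <= radius:
        return x
    return center + radius * diff / norm

def p_sga(grad_fn, x0, center, radius, lr=0.01, steps=200, rng=None):
    """Generic projected stochastic gradient ascent: at each step,
    ascend along a stochastic gradient estimate, then project the
    iterate back onto the feasible set."""
    rng = rng or np.random.default_rng(0)
    x = x0.copy()
    for _ in range(steps):
        g = grad_fn(x, rng)                        # stochastic gradient estimate
        x = x + lr * g                             # ascent step
        x = project_onto_ball(x, center, radius)   # projection step
    return x

if __name__ == "__main__":
    rng = np.random.default_rng(42)
    # Hypothetical stand-ins for the offline-trained per-agent models
    # the abstract mentions; their mean serves as the starting point.
    local_models = [rng.normal(size=4) for _ in range(5)]
    x0 = np.mean(local_models, axis=0)
    target = np.ones(4)
    # Noisy gradient of the toy concave objective f(x) = -||x - target||^2.
    grad = lambda x, r: -2.0 * (x - target) + 0.1 * r.normal(size=x.shape)
    meta = p_sga(grad, x0, center=x0, radius=5.0)
    print(meta)
```

Note that the loop uses only first-order gradient estimates, which is consistent with the abstract's claim of avoiding Hessian evaluation and matrix inversion.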
arXiv.org Artificial Intelligence
May-31-2021
- Country:
- Europe > Finland (0.14)
- North America > United States (0.14)
- Genre:
- Research Report > New Finding (0.68)