Privacy-preserving quantum federated learning via gradient hiding

Li, Changhao, Kumar, Niraj, Song, Zhixin, Chakrabarti, Shouvanik, Pistoia, Marco

arXiv.org Artificial Intelligence 

Distributed quantum computing, including quantum machine learning (QML) [1-9], has garnered considerable attention due to its remarkable capability to harness the collective power of distributed quantum resources, surpassing the limitations of individual quantum nodes. Distributed quantum computation usually involves generating and transmitting quantum states across multiple nodes, leveraging the advancements in quantum communication technologies [10]. Remarkably, distributed quantum computing protocols offer a ray of hope in addressing privacy concerns in the presence of adversaries [10-14], while traditional classical methods have struggled to ensure the confidentiality of sensitive information during distributed processes. These adversaries not only involve third-party attacks, which can be tackled with well-celebrated quantum communication technologies such as quantum key distribution [10, 11], but also include privacy concerns with untrusted computing nodes [12, 13].

To this end, quantum technologies could provide a natural embedding of privacy. To counteract the gradient inversion attack, one recent proposal [9] replaced the classical neural network in the FL model with variational quantum circuits built using expressive quantum feature maps, such that the problem of a successful attack is reduced to solving high-degree multivariate Chebyshev equations. Other quantum-based proposals include adding a certain level of noise to the gradient values to reduce the probability of a successful gradient inversion attack [24], leveraging blind quantum computing [25], and others [26-29]. An alternative to the aforementioned methods is to encode the client's classical gradient values into quantum states and leverage quantum communication between the clients and the server to transmit the states. This provides opportunities to hide the gradient values of individual clients from the server while allowing the server to perform the model aggregation using appropriate quantum operations on their
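The gradient-hiding idea above can be illustrated with a toy simulation. The following sketch is not the protocol of this paper, only a minimal illustration of the underlying interference principle: each client encodes one gradient component as a relative phase of a single-qubit state, and the tensor product of the received states exposes only the phase of the *sum* of the gradients in its extreme amplitude, which the server reads out for aggregation. The scaling factor `GAMMA` and the single-qubit phase encoding are illustrative assumptions.

```python
import numpy as np

GAMMA = 0.1  # hypothetical scaling mapping gradient values into phases (must keep the summed phase within (-pi, pi])

def client_state(g):
    # Client-side encoding: gradient component g becomes a relative phase
    # on the state (|0> + exp(i*GAMMA*g)|1>)/sqrt(2).
    return np.array([1.0, np.exp(1j * GAMMA * g)]) / np.sqrt(2)

def server_aggregate(states):
    # Server tensors the received single-qubit states. The amplitude of
    # |11...1> relative to |00...0> carries exp(i*GAMMA*sum(g_j)): the
    # aggregate is recoverable, while this readout alone does not reveal
    # any individual client's g_j.
    joint = states[0]
    for s in states[1:]:
        joint = np.kron(joint, s)
    total_phase = np.angle(joint[-1] / joint[0])
    return total_phase / GAMMA

grads = [0.8, -1.3, 2.1]
estimate = server_aggregate([client_state(g) for g in grads])
print(estimate)  # ~ sum(grads) = 1.6
```

In a genuine protocol the server would of course hold physical qubits rather than state vectors, and additional mechanisms would be needed against a server that measures states individually; the sketch only shows how aggregation can act on encoded phases rather than on plaintext gradients.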