Buffer Overflow in Mixture of Experts
Jamie Hayes, Ilia Shumailov, Itay Yona
Mixture of Experts (MoE) has become a key ingredient for scaling large foundation models while keeping inference costs steady. We show that expert routing strategies with cross-batch dependencies are vulnerable to attacks: malicious queries can be sent to a model and affect the model's output on other benign queries if they are grouped in the same batch. We demonstrate this via a proof-of-concept attack in a toy experimental setting.
arXiv.org Artificial Intelligence
Feb-8-2024
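The cross-batch dependency described in the abstract can be illustrated with a minimal sketch (hypothetical code, not the authors' implementation): a toy top-1 router with a fixed per-expert buffer that drops tokens once their preferred expert is full. The function name `route_top1_with_capacity`, the drop-on-overflow policy, and the specific logits are assumptions chosen for illustration only.

```python
import numpy as np

def route_top1_with_capacity(logits, capacity):
    """Toy top-1 router with a fixed per-expert buffer (hypothetical sketch).

    Tokens are routed in batch order to their highest-scoring expert; once
    that expert's buffer is full, later tokens preferring it are dropped (-1).
    Because the buffer is shared across the batch, a token's assignment
    depends on which other queries were batched with it.
    """
    num_experts = logits.shape[1]
    load = [0] * num_experts
    assignments = []
    for token_logits in logits:
        expert = int(np.argmax(token_logits))
        if load[expert] < capacity:
            load[expert] += 1
            assignments.append(expert)
        else:
            assignments.append(-1)  # buffer overflow: token is not served
    return assignments

capacity = 2
victim = np.array([[5.0, 0.0, 0.0, 0.0]])  # victim query prefers expert 0

# Benign batch: the other queries prefer other experts, so the victim
# reaches its preferred expert.
benign = np.array([[0.0, 3.0, 0.0, 0.0],
                   [0.0, 0.0, 3.0, 0.0],
                   [0.0, 0.0, 0.0, 3.0]])
print(route_top1_with_capacity(np.vstack([benign, victim]), capacity))  # [1, 2, 3, 0]

# Adversarial batch: attacker queries crafted to prefer expert 0 fill its
# buffer first, so the identical victim query is now dropped.
attack = np.tile([[9.0, 0.0, 0.0, 0.0]], (3, 1))
print(route_top1_with_capacity(np.vstack([attack, victim]), capacity))  # [0, 0, -1, -1]
```

The victim's query is identical in both batches, yet its routing, and hence the output it receives, changes once the attacker's queries fill the shared expert buffer, which is the kind of cross-batch effect the paper's proof-of-concept attack exploits.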