MoME: Mixture of Multimodal Experts for Generalist Multimodal Large Language Models

Gongwei Chen