Mixture-of-Transformers: A Sparse and Scalable Architecture for Multi-Modal Foundation Models