Flexible and Effective Mixing of Large Language Models into a Mixture of Domain Experts

Rhui Dih Lee, Laura Wynter, Raghu Kiran Ganti

arXiv.org Artificial Intelligence 

We present a toolkit for creating a low-cost Mixture of Domain Experts (MOE) from trained models. The toolkit can be used to create a mixture either from full models or from adapters. We perform extensive tests and offer guidance on how to define the architecture of the resulting MOE using the toolkit. A public repository is available.
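The abstract does not spell out the mechanics, so the sketch below illustrates the general idea behind a Mixture of Domain Experts: feed-forward experts taken from already-trained models (or adapter-augmented ones) are placed behind a small router and combined without joint pre-training. Everything here, including the class names, dimensions, and top-k routing, is an illustrative assumption, not the toolkit's actual API; consult the public repository for the real interface.

```python
# Minimal sketch (assumed, not the paper's toolkit): combine pre-trained
# feed-forward "experts" behind a token-level top-k router.
import torch
import torch.nn as nn
import torch.nn.functional as F


class DomainExpertMixture(nn.Module):
    """Routes each token to the top-k of several frozen expert FFNs."""

    def __init__(self, experts: list, hidden_dim: int, top_k: int = 2):
        super().__init__()
        self.experts = nn.ModuleList(experts)
        for p in self.experts.parameters():
            p.requires_grad = False  # experts come pre-trained; keep them frozen
        self.gate = nn.Linear(hidden_dim, len(experts), bias=False)
        self.top_k = top_k

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, seq, hidden_dim)
        logits = self.gate(x)                           # (B, S, num_experts)
        weights, idx = logits.topk(self.top_k, dim=-1)  # top-k experts per token
        weights = F.softmax(weights, dim=-1)
        out = torch.zeros_like(x)
        for k in range(self.top_k):
            for e, expert in enumerate(self.experts):
                mask = idx[..., k] == e                 # tokens routed to expert e
                if mask.any():
                    out[mask] += weights[..., k][mask].unsqueeze(-1) * expert(x[mask])
        return out


if __name__ == "__main__":
    hidden = 64
    # Stand-ins for FFN blocks taken from two domain fine-tuned checkpoints.
    experts = [
        nn.Sequential(nn.Linear(hidden, 4 * hidden), nn.GELU(),
                      nn.Linear(4 * hidden, hidden))
        for _ in range(2)
    ]
    moe = DomainExpertMixture(experts, hidden_dim=hidden, top_k=1)
    tokens = torch.randn(2, 8, hidden)
    print(moe(tokens).shape)  # torch.Size([2, 8, 64])
```

Because the experts are frozen, assembling such a mixture costs only the (small) router parameters, which is consistent with the low-cost framing of the abstract.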
