Rethinking Memory and Communication Costs for Efficient Data Parallel Training of Large Language Models