SWARM Parallelism: Training Large Models Can Be Surprisingly Communication-Efficient