Five Key Facts About Wu Dao 2.0: The Largest Transformer Model Ever Built - KDnuggets
I recently started a new newsletter focused on AI education that already has over 50,000 subscribers. TheSequence is a no-BS (meaning no hype, no news, etc.) AI-focused newsletter that takes 5 minutes to read. The goal is to keep you up to date with machine learning projects, research papers, and concepts.

It seems that every other month we have a new milestone in the race to build massively large transformer models. GPT-2 set new records with a 1.5-billion-parameter model, only to be surpassed by Microsoft's Turing-NLG with 17 billion parameters.
Sep-6-2021, 12:05:16 GMT