What is GPT-3 and how will it affect your current job - MSPoweruser
GPT is short for Generative Pre-trained Transformer (GPT), a language model written by Alec Radford and published in 2018 by OpenAI, the artificial intelligence research laboratory co-founded by Elon Musk. It is a generative language model based on the transformer architecture, trained to predict the next word in a sequence, and it is able to acquire knowledge of the world and process long-range dependencies by pre-training on diverse sets of written material with long stretches of contiguous text.

GPT-2 (Generative Pre-trained Transformer 2) was announced in February 2019. It is an unsupervised transformer language model trained on 8 million documents, totalling 40 GB of text, drawn from articles shared via Reddit submissions. OpenAI was famously reluctant to release the full model, concerned it could be used to spam social networks with fake news.

In May 2020 OpenAI announced GPT-3 (Generative Pre-trained Transformer 3), a model with two orders of magnitude more parameters than GPT-2 (175 billion vs 1.5 billion) that offers a dramatic improvement over its predecessor.
Jul-21-2020, 11:30:27 GMT