What does AI know about having a ball?
In August 2020, I wrote about the stunning storytelling prowess of another LLM, GPT-3 (bit.ly/3RbHfbB). The Generative Pre-trained Transformer, Version 3, I wrote, was being heralded as the first step towards the holy grail of AGI (Artificial General Intelligence), where a machine has the capacity to understand or learn any intellectual task that a human being can. GPT-3 was trained on a massive body of text, mined for statistical regularities: the parameters, or connection strengths between nodes in its neural network. The scale is gargantuan, with 175 billion parameters; all of Wikipedia comprises just 0.6% of its training data! GPT-3, too, was developed by OpenAI, and with DALL-E the company took this to another level.
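To make "statistical regularities" concrete, here is a deliberately tiny sketch in Python. It is nothing like GPT-3's 175-billion-parameter neural network, but a two-word (bigram) frequency table captures the same basic idea: predict the next word from what has tended to follow it in the training text. The corpus and function names are purely illustrative.

```python
# Toy illustration (not GPT-3): the simplest form of a "statistical regularity"
# mined from text is a bigram table counting which word tends to follow which.
from collections import Counter, defaultdict

corpus = "the cat sat on the mat and the cat slept on the mat".split()

# Count, for each word, how often each following word appears.
following = defaultdict(Counter)
for current, nxt in zip(corpus, corpus[1:]):
    following[current][nxt] += 1

def predict_next(word):
    """Return the word most often seen after `word` in the corpus, or None."""
    if word not in following:
        return None
    return following[word].most_common(1)[0][0]

print(following["the"])      # Counter({'cat': 2, 'mat': 2})
print(predict_next("sat"))   # 'on'
```

A real LLM replaces these raw counts with billions of learned weights and conditions on far more than the previous word, but the underlying task, predicting what comes next from patterns in the training data, is the same.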