How to make artificial intelligence more democratic
This year, GPT-3, a large language model capable of parsing text, answering questions and generating new writing, has drawn international media attention. The model, released by OpenAI, a California-based nonprofit that builds general-purpose artificial intelligence systems, has an impressive ability to mimic human writing, but just as notable is its massive size. To build it, researchers gathered more than 45 terabytes of text from Common Crawl, Reddit, Wikipedia and other sources, then used that data to train a model with 175 billion parameters (the adjustable numerical weights a neural network learns), in a process that occupied hundreds of processing units for thousands of hours.

GPT-3 exemplifies a broader trend in artificial intelligence: deep learning, which has in recent years become the dominant technique for creating new AIs, uses enormous amounts of data and computing power to fuel complex, accurate models.
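To make the notion of a "parameter" concrete, the sketch below counts the learnable weights in a tiny fully connected network. This is a hypothetical illustration of the arithmetic only; GPT-3's transformer architecture is far larger and organized very differently.

```python
# Count the learnable parameters (weights + biases) of a tiny
# fully connected network. A layer mapping n_in inputs to n_out
# outputs contributes n_in * n_out weights plus n_out biases.
def count_parameters(layer_sizes):
    total = 0
    for n_in, n_out in zip(layer_sizes, layer_sizes[1:]):
        total += n_in * n_out + n_out  # weights + biases per layer
    return total

# A toy network: 784 inputs -> 256 -> 64 -> 10 outputs.
print(count_parameters([784, 256, 64, 10]))  # 218058
```

Even this toy network has over 200,000 parameters; GPT-3's 175 billion is roughly 800,000 times larger, which is why training it required so much data and compute.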
Jan-5-2021, 07:24:44 GMT