How AI Is Being Transformed by 'Foundation Models'
In the world of computer science and artificial intelligence, few topics are generating as much interest as the rise of so-called "foundation models." These models can be thought of as meta-AI (though not Meta AI, if you see what I mean): systems that pair vast neural networks with even larger training datasets. They can process enormous amounts of data, but more importantly, they adapt readily across domains, shortening and simplifying what has long been a laborious process of training AI systems for individual tasks. If foundation models fulfill their promise, they could bring AI into much broader commercial use.

To give a sense of the scale of these models, GPT-3, a foundation model for natural language processing released two years ago, contains some 175 billion parameters, the adjustable variables that determine how a model transforms its inputs into outputs.
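For a concrete sense of what a "parameter" is, here is a minimal, purely illustrative sketch in PyTorch (a toy two-layer network, not anything drawn from GPT-3 itself, whose code is not public). Every weight and bias in the network is one trainable parameter, and counting them is a one-liner:

```python
# Illustrative sketch only: a toy network, not GPT-3.
# Each weight and bias below is one trainable "parameter".
import torch.nn as nn

model = nn.Sequential(
    nn.Linear(768, 3072),  # weights: 768 * 3072, biases: 3072
    nn.GELU(),             # activation; contributes no parameters
    nn.Linear(3072, 768),  # weights: 3072 * 768, biases: 768
)

total = sum(p.numel() for p in model.parameters())
print(f"{total:,} trainable parameters")  # prints 4,722,432
```

This toy model has about 4.7 million parameters; stacking many such layers at far greater widths is, roughly speaking, how models like GPT-3 reach into the hundreds of billions.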
May 4, 2022, 01:40 GMT