Could 'expiration dates' for AI systems help prevent bias?

#artificialintelligence 

Today's AI technology, much like humans, learns from examples. AI systems are trained on datasets containing text, images, audio, and other information that serves as ground truth. By figuring out the relationships between these examples, AI systems gradually "learn" to make predictions, like which word is likely to come next in a sentence or whether the objects in a picture are inanimate. The technique holds up remarkably well in the language domain, for example, where systems like OpenAI's GPT-3 can write content from essays to advertisements in human-like ways. But like humans, AI that isn't supplied with fresh data eventually grows stale in its predictions, a phenomenon known as model drift.
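The effect of model drift can be illustrated with a toy experiment: a classifier fitted to one data distribution keeps its original decision rule, so its accuracy degrades once the incoming data shifts. The sketch below is purely illustrative, with made-up distributions and a deliberately minimal threshold "model", not a real production system:

```python
import random
import statistics

random.seed(0)

def make_data(n, shift=0.0):
    """Toy data: class 0 ~ N(0,1), class 1 ~ N(2,1); `shift` moves both."""
    xs, ys = [], []
    for _ in range(n):
        y = random.randint(0, 1)
        xs.append(random.gauss(2 * y + shift, 1.0))
        ys.append(y)
    return xs, ys

def fit_threshold(xs, ys):
    """A minimal 'model': the midpoint between the two class means."""
    m0 = statistics.mean(x for x, y in zip(xs, ys) if y == 0)
    m1 = statistics.mean(x for x, y in zip(xs, ys) if y == 1)
    return (m0 + m1) / 2

def accuracy(threshold, xs, ys):
    preds = [1 if x > threshold else 0 for x in xs]
    return sum(p == y for p, y in zip(preds, ys)) / len(ys)

# "Train" once on the original distribution.
train_x, train_y = make_data(5000)
threshold = fit_threshold(train_x, train_y)

# Evaluate on fresh data from the same distribution vs. drifted data.
fresh_x, fresh_y = make_data(5000)
drifted_x, drifted_y = make_data(5000, shift=1.5)

acc_fresh = accuracy(threshold, fresh_x, fresh_y)
acc_drift = accuracy(threshold, drifted_x, drifted_y)
print(f"accuracy on fresh data:   {acc_fresh:.2f}")
print(f"accuracy on drifted data: {acc_drift:.2f}")
```

The stale model's accuracy drops noticeably on the drifted inputs, which is why deployed systems are typically monitored for drift and periodically retrained on newer data.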
