AI trained on novels tracks how racist and sexist biases have evolved
Artificial intelligences picking up sexist and racist biases is a well-known and persistent problem, but researchers are now turning this to their advantage to analyse social attitudes through history. Training AI models on novels from a certain decade can instil them with the prejudices of that era, offering a new way to study how cultural biases have evolved over time.

Large language models (LLMs) such as ChatGPT learn by analysing large collections of text.
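The article does not specify how such biases are measured, but one common technique for quantifying associations learned from a text corpus is a WEAT-style embedding test (comparing how strongly two sets of target words associate with two sets of attribute words). The sketch below is purely illustrative, using toy word vectors in place of embeddings actually trained on decade-specific novels; all vector values here are hypothetical.

```python
# Illustrative WEAT-style bias probe on toy word vectors.
# In a real study, the vectors would come from a model trained on
# novels from a particular decade; these arrays are hypothetical.
import numpy as np

def cosine(u, v):
    """Cosine similarity between two vectors."""
    return float(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v)))

def association(w, A, B):
    """Mean similarity of word w to attribute set A minus attribute set B."""
    return np.mean([cosine(w, a) for a in A]) - np.mean([cosine(w, b) for b in B])

def weat_effect_size(X, Y, A, B):
    """Cohen's-d-style effect size: how differently target sets X and Y
    associate with attribute sets A and B."""
    x = [association(w, A, B) for w in X]
    y = [association(w, A, B) for w in Y]
    return (np.mean(x) - np.mean(y)) / np.std(x + y, ddof=1)

# Toy 2-D vectors: attribute axes A and B, and targets leaning toward each.
A = [np.array([1.0, 0.0])]       # e.g. "career"-type attribute words
B = [np.array([0.0, 1.0])]       # e.g. "family"-type attribute words
X = [np.array([0.9, 0.1])]       # target words leaning toward A
Y = [np.array([0.1, 0.9])]       # target words leaning toward B

effect = weat_effect_size(X, Y, A, B)
print(f"WEAT-style effect size: {effect:.3f}")  # positive => X~A, Y~B bias
```

Running the same probe against models trained on different decades' novels would, in principle, show the association strength shifting over time.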
Feb-20-2025, 13:00:35 GMT