AI trained on novels tracks how racist and sexist biases have evolved
www.newscientist.com
Technology

Questioning a chatbot that has been trained on bestselling books from a particular decade can give researchers a measure of the social biases of that era

20 February 2025

Books can document the cultural biases of the era when they were published
Ann Taylor/Alamy

Artificial intelligences picking up sexist and racist biases is a well-known and persistent problem, but researchers are now turning this to their advantage to analyse social attitudes through history. Training AI models on novels from a certain decade can instil them with the prejudices of that era, offering a new way to study how cultural biases have evolved over time.

Large language models (LLMs) such as ChatGPT learn by analysing large collections of text. They tend to inherit the biases found within their training data: