New Research Suggests Overreliance on AI Could Hinder Critical Thinking
www.cnet.com
Overreliance on AI systems may hinder our critical thinking potential as people offload synthesis to machines, according to researchers at Microsoft and Carnegie Mellon University. The study is set to be presented at the CHI Conference on Human Factors in Computing Systems in Yokohama, Japan, in April.

In the study, the researchers define critical thinking as a hierarchical pyramid, with knowledge at the base, followed by understanding ideas, putting ideas into practice, analyzing them against related ideas, synthesizing or combining those ideas, and evaluating ideas against a set of criteria.

Based on surveys of 319 knowledge workers (roles generally categorized as white collar), the study found that while generative AI can improve efficiency, "it can inhibit critical engagement with work and can potentially lead to long-term overreliance on the tool and diminished skill for independent problem-solving."

The researchers found that workers like to use AI to double-check their work, comparing its output against external sources to meet certain criteria. While this requires critical analysis, the researchers note that workers' use of AI to automate routine or lower-stakes tasks raises concerns about "long-term reliance and diminished independent problem-solving."

Interestingly, when workers have higher confidence in AI responses, it "appear[s] to reduce the perceived effort required for critical thinking tasks." Workers who are confident in their own expertise, however, put greater effort into evaluating AI responses. So while AI can help workers retrieve information faster, they may end up spending more time verifying that the information is accurate and not a hallucination.

"As workers shift from task execution to AI oversight, they trade hands-on engagement for the challenge of verifying and editing AI outputs, revealing both the efficiency gains and the risks of diminished critical reflection," the study said.

The researchers caution against drawing definitive conclusions about AI use and weakened critical thinking, however, noting that correlation does not indicate causation. It's impossible to see inside the human mind and know exactly what thoughts are bouncing around when a person reads an AI-generated answer.

Still, the data did lead to some recommendations. The study says that as workers shift from gathering information to verifying it, they should be trained on the importance of cross-referencing AI outputs and assessing their relevance.

The study comes as AI proliferates across all sectors, with particularly major effects for businesses: 41% of companies expect to reduce their workforce because of AI, according to a survey by the World Economic Forum. Big Tech CEOs already admit they've been offloading more tasks to AI, leading to layoffs and fewer job opportunities. The CEO of Klarna told the BBC he has already reduced his workforce from 5,000 to 3,800 and plans to bring it down further to 2,000, but says remaining employees will be paid more.

A series of AI safety-related executive orders by former President Joe Biden were overturned by President Donald Trump, giving Big Tech fewer guardrails. Last week, Google lifted its ban on AI being used for the development of weapons and surveillance tools.
All of these changes make the results of this study more relevant as workers gain access to more AI tools and are tasked with overseeing more AI-generated information.

The researchers also point out that with any new technological innovation, concerns about declining human cognition are commonplace: Socrates objected to writing, Trithemius objected to printing, and educators have long been wary of calculators and the internet.

But they also note, "A key irony of automation is that by mechanising routine tasks and leaving exception-handling to the human user, you deprive the user of the routine opportunities to practice their judgement and strengthen their cognitive musculature, leaving them atrophied and unprepared when the exceptions do arise."