Cheating Ourselves: When AI Replaces Thinking Instead of Supporting It
Published: May 7, 2025

Key Takeaways
- Research shows that the increasing use of LLM tools like ChatGPT leads to reduced critical thinking and higher cognitive offloading.
- Students who use these tools fail to understand the reasoning behind a problem and end up scoring lower than students who do not use them.
- AI platforms like Cluely encourage people to 'stop thinking' and instead cheat on everything, including exams and jobs.
- There's a need for smarter AI tools like Khanmigo that aid the thinking process instead of replacing it.

AI in education and teaching has been a debated topic for quite a few years now. While some advocate actively for AI learning tech, only a few can see the slow death of human intellect and critical thinking at the hands of AI. In this article, we'll unpack the effects of artificial intelligence on human thinking and cognitive offloading. Sit tight, because this could be a brain-melting one.

Why Think When You Can Prompt?

Let's start with a simple example: a college assignment. Back when 'AI' was little more than an odd combination of two English letters, you actually had to dig through the internet for information, read through piles of material, and then compile the whole assignment yourself. That required research, summarizing, understanding, thinking, and even rephrasing.

Cut to now, when you can type your assignment details into ChatGPT and let it churn out content while you sit back and scroll through Instagram or TikTok. You have a computer-generated, human-like assignment ready within minutes. The need to think, learn, and compile has gone out the window, leaving no room for critical thought or understanding.

During our research for this article, we found ourselves genuinely wondering why students resort to such shortcuts. One possible reason is that education today is largely viewed as a set of tasks you need to complete to get a certificate. That's about it. The willingness to learn new things seems to be dying among new-age students. People submit LLM-generated assignments simply because they think the act of writing up a project adds no value to their educational pursuit.

The Changing Education Landscape

A University of Pennsylvania study titled 'Generative AI Can Harm Learning' sheds more light on the matter. The experiment divided students at a Turkish high school into two groups: one with access to ChatGPT and one without. Students who used ChatGPT solved 48% more mathematical problems correctly during practice. However, when a test was conducted on the same topic, the students with no access to any LLM tool scored better, while those who had used ChatGPT during practice scored 17% lower.

What does this teach us? ChatGPT, or any other LLM, can help you 'complete a task,' but it adds little to your overall intellect and does little for actual learning. In other words, students who used ChatGPT simply copy-pasted the tool's answers without understanding the process behind the solution.

Some argue that we already have technology like spreadsheets and calculators that automate such 'mundane' tasks. Why don't we eliminate those and do things manually for the sake of 'learning'?
Well, a key difference is that you still need to understand the formulas used in a spreadsheet to produce any usable output. It's you who drives the spreadsheet with those formulas. A spreadsheet or a calculator does not alter the way we think; these tools still require us to use our intellect and our critical decision-making.

AI tools, however, are completely different. Instead of helping us think, they end up thinking on our behalf. The very essence of being human is the ability to process information and ponder things from a critical point of view. At the risk of sounding overdramatic, AI could be the death of human thinking.

Increasing Cognitive Offloading: Outsourcing Thinking

Another study, titled 'AI Tools in Society: Impacts on Cognitive Offloading and the Future of Critical Thinking,' examined the correlation between cognitive offloading and the use of AI tools. Cognitive offloading is the practice of reducing mental work by handing a task over to external tools, which in this case are AI apps.

The research shows that younger participants (aged 17-25) exhibit the highest use of AI tools along with the lowest critical thinking scores. Cognitive offloading was also highest in this age group. In contrast, older participants (46 and above) exhibited low AI tool usage and higher critical thinking scores.

It's important to note that even these younger participants have only had access to AI tools for the last couple of years. Unlike the present generation of students, they didn't grow up with these tools through childhood or early adolescence. Even so, they show low critical thinking and high cognitive offloading. We can extrapolate these results to the current batch of students who rely entirely on AI tools for their education: if the research were repeated five years from now, those students might score even lower on critical thinking than today's participants.

The same study found a strong negative correlation between the use of AI tools and critical thinking, meaning the more people use these tools, the lower their critical thinking scores. It also found a strong positive correlation between AI tool use and cognitive offloading: the more people use AI tools, the more of their thinking they hand over to them.

AI Is Making Cheating the New Fad

Enter Cluely, an AI assistant with a tagline that reads, 'Let's Cheat on Everything.' Unfortunately, that's exactly what it's meant for: cheating the human intellect.

What's more concerning is Cluely's ad campaign, which shows two people on a date. The guy wears smart glasses that tell him what to say next: the system listens to the girl's responses and curates the 'perfect reply.' In the video, he can be seen lying about his age, job, and preferences. What's more shocking is how blatantly the brand declares, 'We built Cluely so you never have to think alone again.'

While this may sound fun to some, it's chilling to think about where such tech might eventually take us. If the death of critical thinking and problem-solving weren't scary enough, there's also the risk of AI killing our ability to have genuine human connection and conversation. Tools like Cluely don't want to improve the human experience; they want to replace it. Eventually, it would just be two computer programs dating each other in human skin.
Even the AI romance played by Ana de Armas in Blade Runner 2049 is depressing, if you really think about it. The scarier part is that Cluely is not a backyard school project. It's a real AI tool that has raised $5.3 million in seed funding.

AI: A Double-Edged Sword

Not everything is as grim as it looks. AI has undoubtedly changed the way young minds learn. Earlier, we had to dig through several pages of a textbook to understand complex equations and theorems. Now, human-like AI tools gamify the whole learning experience and break complex topics down into small, actionable lessons.

Sal Khan, the founder of Khan Academy, showed us the positive side of AI in education two years ago in a TED Talk, when he unveiled Khanmigo, an AI learning assistant. Unlike LLMs that blurt out the answer to any question, Khanmigo gives students a hint about a math problem and encourages them to solve it themselves, so they still have to think on their own.

A lot of the concern about AI in education is that students won't learn how to write or think. Khanmigo addresses this to an extent. Instead of writing 'for' you, this AI assistant 'writes with you.' For example, if you want to write a story, it will ask you to write the first two lines, then add another two lines of its own, and repeat the process until you have a full story. Khanmigo doesn't kill creativity; it stimulates it.

This is how AI should be used in education: not as a way to offload our thinking, but as a stimulus to human thought and to the learning process as a whole.

The bigger question is, whose responsibility is it to ensure learning stays a human experience? To put it bluntly, companies are not going to stop producing LLM tools like ChatGPT, Gemini, or Claude simply because they're stifling human thinking. There's money to be made, so goodbye moral responsibility.

Therefore, a large part of the responsibility falls on teachers, parents, and students themselves. Guardians should regulate the kind of AI tools a child has access to. At the same time, students should become more aware of the long-term harm caused by replacing their own thought process with an AI prompt. While tools like Khanmigo can be effective aids to learning, 'feed me the answer' AI tools can kill human creativity and critical thinking. The choice is yours.

Krishi is a seasoned tech journalist with over four years of experience writing about PC hardware, consumer technology, and artificial intelligence. Clarity and accessibility are at the core of Krishi's writing style. He believes technology writing should empower readers, not confuse them, and he's committed to ensuring his content is always easy to understand without sacrificing accuracy or depth. Over the years, Krishi has contributed to some of the most reputable names in the industry, including Techopedia, TechRadar, and Tom's Guide.

A man of many talents, Krishi has also proven his mettle as a crypto writer, tackling complex topics with both ease and zeal. His work spans various formats, from in-depth explainers and news coverage to feature pieces and buying guides. Behind the scenes, Krishi operates from a dual-monitor setup (including a 29-inch LG UltraWide) that's always buzzing with news feeds, technical documentation, and research notes, as well as the occasional gaming session that keeps him fresh.
Krishi thrives on staying current, always ready to dive into the latest announcements, industry shifts, and their far-reaching impacts. When he's not deep into research on the latest PC hardware news, Krishi would love to chat with you about day trading and the financial markets. Oh, and cricket as well.