
AI use damages professional reputation, study suggests
robot's little helper

New Duke study says workers judge others for AI use, and hide their own use, fearing stigma.

Benj Edwards – May 8, 2025 4:23 pm

Credit: demaerre via Getty Images

Using AI can be a double-edged sword, according to new research from Duke University. While generative AI tools may boost productivity for some, they might also quietly damage your professional reputation.

On Thursday, the Proceedings of the National Academy of Sciences (PNAS) published a study showing that employees who use AI tools like ChatGPT, Claude, and Gemini at work face negative judgments about their competence and motivation from colleagues and managers.

"Our findings reveal a dilemma for people considering adopting AI tools: Although AI can enhance productivity, its use carries social costs," write researchers Jessica A. Reif, Richard P. Larrick, and Jack B. Soll of Duke's Fuqua School of Business.

The Duke team conducted four experiments with over 4,400 participants to examine both anticipated and actual evaluations of AI tool users. Their findings, presented in a paper titled "Evidence of a social evaluation penalty for using AI," reveal a consistent pattern of bias against those who receive help from AI.

What made this penalty particularly concerning for the researchers was its consistency across demographics. They found that the social stigma against AI use wasn't limited to specific groups.

Fig. 1 from the paper "Evidence of a social evaluation penalty for using AI." Credit: Reif et al.

"Testing a broad range of stimuli enabled us to examine whether the target's age, gender, or occupation qualifies the effect of receiving help from AI on these evaluations," the authors wrote in the paper. "We found that none of these target demographic attributes influences the effect of receiving AI help on perceptions of laziness, diligence, competence, independence, or self-assuredness. This suggests that the social stigmatization of AI use is not limited to its use among particular demographic groups. The result appears to be a general one."

The hidden social cost of AI adoption

In the first experiment, participants imagined using either an AI tool or a dashboard creation tool at work. Those in the AI group expected to be judged as lazier, less competent, less diligent, and more replaceable than those using conventional technology. They also reported less willingness to disclose their AI use to colleagues and managers.

The second experiment confirmed these fears were justified. When evaluating descriptions of employees, participants consistently rated those receiving AI help as lazier, less competent, less diligent, less independent, and less self-assured than those receiving similar help from non-AI sources or no help at all.

Fig. 3 from the paper "Evidence of a social evaluation penalty for using AI." Credit: Reif et al.

The researchers discovered this bias affects real business decisions. In a hiring simulation, managers who didn't use AI themselves were less likely to hire candidates who regularly used AI tools. However, managers who frequently used AI showed the opposite preference, favoring the AI-using candidates.

The final experiment revealed that perceptions of laziness directly explain this evaluation penalty.
The researchers found this penalty could be offset when AI was clearly useful for the assigned task: when using AI made sense for the job, the negative perceptions diminished significantly. Notably, evaluators' own experience with AI also shaped their judgments; participants who used AI frequently themselves were less likely to perceive an AI-using candidate as lazy.

A complicated picture

The Duke study also notes that similar concerns about stigma have historically accompanied other new technologies. From Plato questioning whether writing would undermine wisdom, to modern debates about calculators in education, people have long worried that labor-saving tools might reflect poorly on users' abilities.

Reif and colleagues suggest this social cost may present a hidden barrier to AI adoption in workplaces. Even as organizations push AI implementation, individual employees might resist out of concern about how they'll be perceived.

That dilemma is apparently already here. In August last year, while covering ChatGPT hitting 200 million active weekly users, we mentioned that Wharton professor Ethan Mollick (who frequently researches AI) called people who use AI without telling their bosses "secret cyborgs." Because many companies ban the use of AI outputs, many workers have anecdotally turned to secret AI use.

And if that doesn't complicate the picture enough, we previously covered a study that adds another layer to AI workplace issues. Research from economists at the University of Chicago and the University of Copenhagen found that while 64–90 percent of workers reported time savings from AI tools, these benefits were sometimes offset by new tasks created by the technology. That study found AI tools actually generated additional work for 8.4 percent of employees, including non-users tasked with checking AI output quality or detecting AI use in student assignments. So although using AI for some tasks potentially saves time, employees may be creating additional work for themselves or others.

Meanwhile, the World Economic Forum's Future of Jobs Report 2025 suggested that AI may create 170 million new positions globally while eliminating 92 million jobs, resulting in a net gain of 78 million jobs by 2030. The picture is complicated, but AI's impact on work clearly continues to unfold.

Benj Edwards, Senior AI Reporter

Benj Edwards is Ars Technica's Senior AI Reporter and founded the site's dedicated AI beat in 2022. He's also a tech historian with almost two decades of experience. In his free time, he writes and records music, collects vintage computers, and enjoys nature. He lives in Raleigh, NC.