Tech Use Isn't Driving Dementia in Older Adults

Smartphone and computer use hasn't put today's older adults at increased risk of cognitive decline

By Payal Dhar, edited by Allison Parshall

April 15, 2025

Screens are steadily taking over more and more of our lives, leading some researchers to worry about the effects of long-term use on older adults' brains. It has been suggested that this could lead to so-called digital dementia: that depending on digital technology throughout our lives might detract from cognitive functioning in our later years.

But new research suggests this hypothesis doesn't hold, at least for the generation of adults who first routinely used smartphones, computers and the Internet, and who are now reaching the age when cognitive impairment often starts to appear. In a paper published on Monday in Nature Human Behaviour, researchers analyzed 57 studies of digital technology use and cognitive function in more than 400,000 older adults across the world. They found that people in the "digital pioneer" generation who engaged more with digital technology did not have higher rates of cognitive impairment. In fact, technology use was associated with lower rates of cognitive decline. Yet the nature of the apparent relationship between these factors remains unclear.

One reason some researchers worried about "digital dementia" was previous research that linked television viewing, a passive activity, with negative outcomes such as an increased risk of Alzheimer's disease. On the surface, most computer or smartphone screen time might seem similarly passive, says the new paper's co-author Michael Scullin, a neuroscience researcher at Baylor University. But "for this group of middle-aged and older adults, [phones and computers] have been used for mentally stimulating activities and for social connections," such as solving puzzles, engaging with the news, chatting with friends, or using reminders and alarms to help with daily activities.

Scullin and his co-author Jared Benge, a neuropsychologist at the University of Texas at Austin, gathered large studies that tracked cognitive and other health outcomes and that also included information on participants' everyday technology use. The authors identified 57 studies and rated the quality of their evidence based on sample size, methods and more. In analyzing the studies' results, the researchers found that technology use pointed to a reduced risk of cognitive decline, an outcome that could not be explained by demographic, socioeconomic, health or other lifestyle factors alone.

The possibility that using digital technology helps stave off cognitive decline aligns with cognitive reserve theory, the idea that the brain can work around damage from neurodegenerative diseases by finding alternative ways to complete tasks. This theory, based on a 1988 study in which individuals with no dementia symptoms were found to have Alzheimer's-like changes in their brains, holds that engaging with complex mental activities may lead to better cognitive well-being in older age.

But it's also possible that the reverse is true: instead of technology use staving off dementia, early experiences of cognitive decline could decrease older adults' likelihood of engaging with technology in the first place. Or an unknown third variable could be affecting both technology use and cognitive decline. "Correlation is not causation," Benge stresses.

"I was not especially surprised by the results, but I was surprised by how clear and consistent they were," says neuroscientist Sam Gilbert of University College London, who wasn't involved with the study.

"This study provides a robust challenge to the pervasive fears about 'digital dementia' and highlights the potential of technology as a tool for cognitive enrichment," says Chiara Scarampi, a neuroscientist at the University of Geneva, who also was not involved with the study.

"'Digital dementia' has always felt like a catchy but overstated concept to me," she adds. "Cognitive offloading—using tools like reminders or GPS—is not inherently harmful. In fact, it can free up cognitive resources for more complex tasks."