Science proves some Reddit users just love to troll, disagree and argue
Sayan Sen · Neowin · May 12, 2025 10:12 EDT
Online communities shape discussions, influence opinions and collective behavior, and even affect real-world decisions. But spotting harmful users, such as trolls and people who spread misinformation, isn’t easy. Traditional methods usually focus on what people say or who they’re connected to, and both approaches have their limits. New research suggests a smarter way: identifying users by how they behave rather than just by what they say.
At the ACM Web Conference, researchers introduced a technique that uses Inverse Reinforcement Learning (IRL)—a tool commonly used in fields like self-driving cars and game theory—to analyze how people interact online. This method makes it possible to track behavioral patterns and understand how users contribute to online discussions, rather than just looking at the content they post.
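The article doesn’t go into the mechanics, but the core idea is easy to sketch. Below is a toy Python illustration, not the paper’s implementation: every state, action, and trajectory is invented, and the maximum-entropy IRL loop is heavily simplified (finite horizon, a single stationary soft policy). It shows the general recipe: treat each user session as a sequence of interaction states, then search for a reward function under which an agent would reproduce the visitation patterns seen in the data.

```python
import numpy as np
from scipy.special import logsumexp, softmax

# Toy "discussion" MDP: states a user can occupy while interacting with a thread.
STATES = ["lurking", "reading", "replying", "gone"]   # hypothetical labels
ACTIONS = ["open_thread", "post_reply", "leave"]      # hypothetical labels
S, A = len(STATES), len(ACTIONS)

# Deterministic toy transitions: T[s, a] = index of the next state.
T = np.array([
    [1, 0, 3],   # lurking:  open_thread -> reading, post_reply -> lurking, leave -> gone
    [1, 2, 3],   # reading:  open_thread -> reading, post_reply -> replying, leave -> gone
    [1, 2, 3],   # replying: same options as reading
    [3, 3, 3],   # gone is absorbing
])

F = np.eye(S)          # one-hot state features; the learned reward is linear in these
HORIZON = 4            # matches the length of the toy trajectories below

def soft_policy(theta):
    """Stochastic policy under reward r = F @ theta, via soft value iteration."""
    r = F @ theta
    V = np.zeros(S)
    for _ in range(HORIZON):
        Q = r[:, None] + V[T]          # Q[s, a] = r(s) + V(next state)
        V = logsumexp(Q, axis=1)       # soft maximum over actions
    return softmax(Q, axis=1)

def expected_visits(pi, start=0):
    """Expected state-visitation counts over HORIZON steps when following pi."""
    counts, p = np.zeros(S), np.zeros(S)
    p[start] = 1.0
    for _ in range(HORIZON):
        counts += p
        nxt = np.zeros(S)
        for s in range(S):
            for a in range(A):
                nxt[T[s, a]] += p[s] * pi[s, a]
        p = nxt
    return counts

# Hypothetical "disagreer" sessions: open a thread, fire off one reply, leave.
demos = [["lurking", "reading", "replying", "gone"]] * 10
mu_demo = np.mean([F[[STATES.index(s) for s in t]].sum(axis=0) for t in demos], axis=0)

# Maximum-entropy IRL gradient ascent: nudge the reward until the model's
# expected visitations match the visitations seen in the demonstrations.
theta = np.zeros(S)
for _ in range(300):
    mu_model = expected_visits(soft_policy(theta))
    theta += 0.1 * (mu_demo - mu_model)

print(dict(zip(STATES, np.round(theta, 2))))   # larger weight = state the users appear to seek out
```

On these made-up sessions, the learned weights favor the states the demonstrated trajectories pass through and penalize the absorbing "gone" state relative to them; with richer states and real interaction logs, distinct user types would show up as distinct learned reward profiles.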
For their study, researchers examined 5.9 million Reddit interactions over six years and found five different types of users based on their behavior. One interesting group stood out—the "disagreers." These users actively seek out conversations just to argue. Instead of discussing topics constructively, they jump into debates, post opposing views, and then leave without waiting for responses. These kinds of users were most common in political subreddits like r/news, r/politics, and r/worldnews. But surprisingly, they weren’t as common in the now-banned r/The_Donald, a pro-Trump subreddit. In that forum, people mostly agreed with each other but showed hostility toward outsiders.
The study also looked at homophily, which is the tendency of people to connect with others who share similar views. This tendency creates echo chambers, where like-minded users reinforce each other’s opinions and deepen division. Normally, researchers measure homophily by looking at content (what people talk about) or social networks (who they interact with), but those methods don’t work well on platforms like Reddit, where users interact based on topics rather than friendships.
The researchers came up with a new way to measure homophily using behavior. Instead of just studying topics, they used IRL to analyze patterns in user activity. This revealed surprising connections: people discussing soccer (r/soccer) and e-sports (r/leagueoflegends) behave in almost exactly the same way, even though they’re talking about completely different subjects. Fans in both communities strongly support their teams, follow matches closely, debate strategies, and critique rivals. This challenges the idea that online polarization comes from echo chambers built only around topics; instead, it suggests that how people interact matters at least as much as what they talk about.
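The article doesn’t reproduce the paper’s homophily metric, but the comparison it describes can be illustrated in a few lines of Python. In this sketch the behavior type names, the per-subreddit proportions, and the choice of cosine similarity are all assumptions made purely for illustration; the point is only to show how two communities can look nearly identical behaviorally while discussing entirely different topics.

```python
import numpy as np

# Hypothetical per-subreddit behavior profiles: the fraction of interactions
# falling into each behavior type (type names and numbers are invented).
behavior_types = ["disagreer", "supporter", "lurker", "broadcaster", "debater"]
profiles = {
    "r/soccer":          np.array([0.15, 0.40, 0.20, 0.10, 0.15]),
    "r/leagueoflegends": np.array([0.14, 0.42, 0.18, 0.11, 0.15]),
    "r/politics":        np.array([0.35, 0.15, 0.15, 0.10, 0.25]),
}

def cosine(u, v):
    """Cosine similarity between two behavior profiles (1.0 = identical mix)."""
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))

names = list(profiles)
for i, a in enumerate(names):
    for b in names[i + 1:]:
        print(f"{a} vs {b}: {cosine(profiles[a], profiles[b]):.3f}")
```

With profiles like these, r/soccer and r/leagueoflegends come out far more similar to each other than either does to r/politics, mirroring the behavioral homophily the study describes.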
The findings could be very useful for social media platforms. Unlike traditional moderation methods, which focus on detecting harmful content, this behavioral approach is harder to trick or avoid. Changing a user’s wording is easy, but changing how they interact takes much more effort. Moderators could use this research to identify problematic users early, even before they start posting lots of harmful content.
Ultimately, the study highlights an important lesson: What we say online matters, but how we interact shapes the digital world even more. As social platforms struggle with harassment, misinformation, and polarization, using behavioral analysis alongside traditional content moderation might be the key to creating healthier, more constructive online communities.
Source: The Conversation, ACM Digital Library
This article was generated with some help from AI and reviewed by an editor.