Study finds 75% of Facebook shares are made without reading the content
A new study has found that most social media users share links without clicking on them first, relying only on headlines and short summaries. The analysis, which examined over 35 million public Facebook posts, found that around 75% of shares occurred without users engaging with the full content. Notably, political content, especially from both extremes of the ideological spectrum, was more likely to be shared without being clicked than neutral content. The findings have been published in Nature Human Behaviour.

The researchers aimed to understand how and why people share content on social media without reading it first. Social media platforms thrive on sharing, a behavior that drives engagement and allows content to go viral. However, the ease and speed of sharing mean users often act impulsively, spreading links based on superficial cues like headlines or the number of likes. This behavior can inadvertently contribute to the dissemination of misinformation, particularly in the political sphere. Previous research has suggested that people often form opinions from short snippets, creating an illusion of knowledge without truly understanding the content.

"The inspiration for our research is in understanding the phenomenon of sharing, which in my mind is the single most influential action on social media. Not only does sharing result in the multiplicative effect of information spreading through networks of individuals, it has in recent years fueled the epidemic of online misinformation," said corresponding author S. Shyam Sundar, Evan Pugh University Professor and the James P. Jimirro Professor of Media Effects at Penn State.

"I have been interested in fellow online users acting as de facto communication sources ever since my dissertation back in 1995. With sharing features of social media, the ability of ordinary people to serve as sources of news and public affairs information has dramatically increased. What most people do not realize is that their friends and family on social media do not have the journalistic training to vet facts and double-check them before disseminating. We tend to be swayed by whatever they share."

"In my lab group, we have long been interested in studying how deeply online users process information, how much thought they put into what they read and forward on social media and mobile phones," Sundar told PsyPost. "So, when the opportunity arose to study sharing on a large scale with the URL Shares dataset released by Meta, which is the largest social science dataset ever assembled, we were obviously interested in exploring the sheer volume of the phenomenon of sharing without clicking, which is an indicator of the superficiality of information processing."

To investigate this phenomenon, the team analyzed a massive dataset provided by Facebook's collaboration with Social Science One. The dataset included billions of interactions with over 35 million URLs shared on Facebook from 2017 to 2020. The team focused on the top 4,617 domains (such as CNN, Fox News, and The New York Times) and the 35 million URLs shared on Facebook during this four-year period.

The researchers examined two main areas: the frequency of shares without clicks and the patterns of political content sharing. The data were separated into political and non-political content using a machine learning classifier trained to identify politically relevant keywords. Political content included URLs tied to elections, candidates, and other partisan topics, while non-political content ranged from entertainment to general news.
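The article does not reproduce the researchers' code, but the two measurements described above can be sketched briefly. The Python snippet below is a simplified, hypothetical illustration rather than the study's actual pipeline: the keyword list, column names, and counts are invented, a plain keyword match stands in for the trained classifier, and shares in excess of clicks serve as a rough proxy for shares made without clicking.

```python
# Illustrative sketch only: hypothetical column names and keyword list,
# not the Social Science One URL Shares schema or the authors' classifier.
import pandas as pd

# Tiny stand-in keyword list; the study used a trained machine learning classifier.
POLITICAL_KEYWORDS = {"election", "candidate", "senate", "congress", "ballot"}

def looks_political(headline: str) -> bool:
    """Crude keyword flag standing in for the study's political-content classifier."""
    return bool(set(headline.lower().split()) & POLITICAL_KEYWORDS)

def share_without_click_rate(df: pd.DataFrame) -> float:
    """Rough proxy: fraction of total shares in excess of total clicks, from aggregated per-URL counts."""
    excess = (df["shares"] - df["clicks"]).clip(lower=0).sum()
    return excess / df["shares"].sum()

# Hypothetical aggregated rows: headline, total shares, total clicks.
urls = pd.DataFrame({
    "headline": ["Candidate surges ahead in senate race", "Ten easy weeknight dinners"],
    "shares": [1000, 400],
    "clicks": [220, 250],
})
urls["political"] = urls["headline"].apply(looks_political)

political_rate = share_without_click_rate(urls[urls["political"]])
other_rate = share_without_click_rate(urls[~urls["political"]])
print(f"share-without-click rate, political: {political_rate:.0%}, non-political: {other_rate:.0%}")
```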
The team analyzed users' sharing behaviors across different political leanings (liberal, neutral, and conservative) and examined whether users' ideological alignment with the content influenced their likelihood of sharing it without clicking. They also looked specifically at fact-checked URLs to identify patterns in the spread of misinformation.

Across all 35 million URLs analyzed, approximately 75% of shares occurred without the user clicking on the link to view its full content. This trend was even stronger for political content, particularly at the ideological extremes. The spread of misinformation was particularly concerning. Fact-checked URLs identified as false were more likely to be shared without being clicked than true content.

"A key takeaway is that most of the shared links we encounter in Facebook are shared without first being read by the person sharing them," Sundar explained. "This tells us that social media users are simply glancing at the headline and the blurb when deciding to blast a news link to their networks. Such dissemination can have a multiplicative effect and result in rapid spread of information to millions of folks online. This can result in virality of misinformation, spreading fake news and conspiracy theories."

The researchers observed another clear pattern: the more politically extreme the content, the more likely it was to be shared without being clicked. This trend held true for users across the political spectrum. In other words, whether content was strongly liberal or conservative, it attracted more superficial sharing compared to neutral content.

Users were more likely to share content that aligned with their political beliefs. For example, liberals were more likely to share left-leaning content without clicking, while conservatives were more likely to share right-leaning content. This suggests that users rely on headlines that confirm their existing biases, potentially bypassing the need to engage with the full content.

"The more politically extreme the content is, the more it is shared without being clicked upon first," Sundar told PsyPost. "This is true for both extreme left and extreme right. As we know, there tends to be a lot of strong opinions and biased commentary on the extremes of the political spectrum. As such, there is more scope for fake news and conspiracy theories masquerading as legitimate news in politically extreme news domains."

"In the dataset we accessed, there were 2,969 URLs that were fact-checked by a third party and determined to be false. The vast majority of these links were from conservative news domains, and so, unsurprisingly, we found that conservatives were five times more likely than liberals to share these links, most often without clicking on them and reading the false stories first. This suggests that if politically partisan users see a headline that seems aligned with their political ideology, they will readily share the story without bothering to verify if it is really true."

The study highlights a concerning trend in how social media users interact with content. But it does have limitations. The analysis relied on aggregated data, meaning the researchers could not observe individual users' behaviors directly.
Some shares without clicks might still reflect deliberate actions: for example, resharing familiar content without revisiting it.

Additionally, the study focused only on Facebook, so it remains unclear whether similar patterns exist on other platforms like Twitter or Instagram. Future research could explore these behaviors on a broader scale and examine how different devices, such as mobile phones versus computers, influence users' sharing habits.

The researchers suggest that these findings have significant implications for both social media platforms and users. Social media interfaces could be redesigned to encourage more deliberate sharing. For instance, platforms could implement prompts reminding users to read an article before sharing it or provide indicators showing whether a link has been clicked. These interventions could reduce the spread of misinformation and promote more thoughtful engagement with news content.

"If platforms implement a warning that the content might be false and make users acknowledge the danger in doing so, that might help people think before sharing," Sundar said.

The study, "Sharing without clicking on news in social media," was authored by S. Shyam Sundar, Eugene Cho Snyder, Mengqi Liao, Junjun Yin, Jinping Wang, and Guangqing Chi.