TikTok, a widely used social media platform with over a billion active users worldwide, has become a key source of news, particularly for younger audiences. This growing influence has raised concerns about potential political biases in its recommendation algorithm, especially during election cycles. A recent preprint study examined this issue by analyzing how TikTok's algorithm recommended political content ahead of the 2024 presidential election. Using a controlled experiment involving hundreds of simulated user accounts, the study found that Republican-leaning accounts received significantly more ideologically aligned content than Democratic-leaning accounts, while Democratic-leaning accounts were more frequently exposed to opposing viewpoints.

TikTok has become a major force among social media platforms, boasting over a billion monthly active users worldwide and 170 million in the United States. It has also emerged as a significant source of news, particularly for younger demographics. This has raised concerns about the platform's potential to shape political narratives and influence elections.

Despite these concerns, there has been limited research auditing TikTok's recommendation algorithm for political bias, especially compared with the extensive research on other social media platforms such as Facebook, Instagram, YouTube, X (formerly Twitter), and Reddit.

"We previously conducted experiments auditing YouTube's recommendation algorithms. This study, published in PNAS Nexus, demonstrated that the algorithm exhibited a left-leaning bias in the United States," said Yasir Zaki, an assistant professor of computer science at New York University Abu Dhabi.

"Given TikTok's widespread popularity, particularly among younger demographics, we sought to replicate this study on TikTok during the 2024 U.S. presidential elections. Another motivation was that concerns over TikTok's Chinese ownership had led many U.S. politicians to advocate for banning the platform, citing fears that its recommendation algorithm could be used to promote a political agenda."

To examine how TikTok's algorithm recommends political content, the researchers designed an extensive audit experiment. They created 323 "sock puppet" accounts (fake accounts programmed to simulate user behavior) across three politically diverse states: Texas, New York, and Georgia. Each account was assigned a political leaning: Democratic, Republican, or neutral (the control group).

The experiment consisted of two stages: a conditioning stage and a recommendation stage. In the conditioning stage, the Democratic accounts watched up to 400 Democratic-aligned videos and the Republican accounts watched up to 400 Republican-aligned videos; neutral accounts skipped this stage. This was done to teach TikTok's algorithm the political preferences of each account.

In the recommendation stage, all accounts watched videos on TikTok's For You page, the platform's main feed of recommended content. Each account watched 10 videos, paused for one hour, and repeated this process for six days, so that each experimental run lasted one week. In total, the researchers collected data on approximately 394,000 videos viewed by these accounts between April 30th and November 11th, 2024.
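The researchers' automation code is not reproduced in the article, but the two-stage design can be summarized in a short Python sketch. Everything here is illustrative: the `SockPuppet` class, its methods, and the `sessions_per_day` default are placeholder assumptions standing in for whatever browser automation the team actually used; only the numeric parameters (up to 400 conditioning videos, 10 videos per session, one-hour pauses, six days) come from the study as described.

```python
import time

# Parameters reported in the study
CONDITIONING_VIDEOS = 400   # up to 400 party-aligned videos per seeded account
VIDEOS_PER_SESSION = 10     # For You videos watched per session
PAUSE_SECONDS = 3600        # one-hour pause between sessions
RUN_DAYS = 6                # length of the recommendation stage

class SockPuppet:
    """Placeholder for the browser automation driving one account;
    the study's actual implementation is not public."""
    def watch(self, video):
        raise NotImplementedError
    def for_you_feed(self, limit):
        raise NotImplementedError

def condition(account: SockPuppet, seed_videos: list) -> None:
    # Conditioning stage: teach the algorithm the account's leaning.
    # Neutral (control) accounts skip this stage entirely.
    for video in seed_videos[:CONDITIONING_VIDEOS]:
        account.watch(video)

def recommendation_stage(account: SockPuppet, sessions_per_day: int = 24) -> list:
    # Recommendation stage: watch 10 For You videos, pause for an hour,
    # and repeat for six days, logging every recommended video.
    # (sessions_per_day is an assumption; the article specifies only the
    # 10-video/one-hour cadence.)
    log = []
    for _ in range(RUN_DAYS * sessions_per_day):
        for video in account.for_you_feed(limit=VIDEOS_PER_SESSION):
            account.watch(video)
            log.append(video)
        time.sleep(PAUSE_SECONDS)
    return log
```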
To analyze the political content of the recommended videos, the researchers downloaded the English transcripts of videos where available (22.8% of unique videos), then used a panel of three large language models (GPT-4o, Gemini-Pro, and GPT-4) to classify each video. The models answered questions about whether the video was political, whether it concerned the 2024 U.S. elections or major political figures, and what its ideological stance was (pro-Democratic, anti-Democratic, pro-Republican, anti-Republican, or neutral). The majority vote of the three models served as the final classification for each question.
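The article describes this classification step only at a high level. As a rough illustration, the majority vote over three model outputs could be implemented as below; `query_model` is a hypothetical stand-in for the actual prompting pipeline, and the handling of three-way splits is an assumption, since the article does not say how ties were resolved.

```python
from collections import Counter

STANCE_LABELS = {"pro-Democratic", "anti-Democratic",
                 "pro-Republican", "anti-Republican", "neutral"}

def query_model(model: str, transcript: str) -> str:
    """Hypothetical wrapper that prompts one model and returns a
    stance label; the study's actual prompts are not reproduced."""
    raise NotImplementedError

def classify_stance(transcript: str,
                    models=("gpt-4o", "gemini-pro", "gpt-4")):
    # Collect one stance label per model, discarding malformed answers.
    votes = [label for label in (query_model(m, transcript) for m in models)
             if label in STANCE_LABELS]
    if not votes:
        return None  # no usable classification for this transcript
    label, count = Counter(votes).most_common(1)[0]
    # With three voters, a label returned at least twice is the majority;
    # a three-way split yields no consensus (tie handling is an assumption).
    return label if count >= 2 else None
```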
The analysis uncovered significant asymmetries in content distribution on TikTok. Republican-seeded accounts received approximately 11.8% more party-aligned recommendations than Democratic-seeded accounts, while Democratic-seeded accounts were exposed to approximately 7.5% more opposite-party recommendations on average. These differences were consistent across all three states and could not be explained by differences in engagement metrics such as likes, views, shares, comments, or followers.

"We found that TikTok's recommendation algorithm was not neutral during the 2024 U.S. presidential elections," explained Talal Rahwan, an associate professor of computer science at New York University Abu Dhabi. "Across all three states analyzed in our study, the platform consistently promoted more Republican-leaning content. We showed that this bias cannot be explained by factors such as video popularity and engagement metrics, key variables that typically influence recommendation algorithms."

Further analysis showed that the bias was primarily driven by negative partisanship content, meaning content that criticizes the opposing party rather than promoting one's own. Both Democratic- and Republican-conditioned accounts were recommended more negative partisan content, but the effect was more pronounced for Republican accounts. Negative-partisanship videos were 1.78 times more likely than positive-partisanship videos to be recommended as ideological mismatches.

"We observed a bias toward negative partisanship in TikTok's recommendations," Zaki noted. "Regardless of the political party, Democratic or Republican, the algorithm prioritized content that criticized the opposing party over content that promoted one's own party."

The researchers also examined the top Democratic and Republican channels on TikTok by follower count. Republican channels had a significantly higher mismatch proportion, meaning their videos were more likely to be recommended to accounts with the opposite political leaning. Notably, videos from Donald Trump's official TikTok channel were recommended to Democratic-conditioned accounts nearly 27% of the time, while Kamala Harris's videos were recommended to Republican-conditioned accounts only 15.3% of the time.

Finally, the researchers analyzed the topics covered in partisan videos. Topics stereotypically associated with the Democratic Party, such as climate change and abortion, were covered more often by Democratic-aligned videos, while topics such as immigration, foreign policy, and the war in Ukraine were covered more often by Republican-aligned videos. Videos on immigration, crime, the Gaza conflict, and foreign policy were the most likely to be recommended as ideological mismatches to Democratic-conditioned accounts.

To build on this work, future research could explore how TikTok's algorithm behaves across different election cycles, investigate how misinformation is distributed within partisan content, and compare TikTok's political content recommendations with those of other major platforms. Additionally, studies incorporating real user data alongside automated experiments could provide a more comprehensive understanding of how individuals experience political content on TikTok. Given the platform's growing role in shaping public discourse, continued scrutiny of its recommendation system will be essential for assessing its impact on political knowledge and voter decision-making.

"We want to address fundamental questions about the neutrality of social media platforms," Rahwan said.

The study, "TikTok's recommendations skewed towards Republican content during the 2024 U.S. presidential race," was authored by Hazem Ibrahim, HyunSeok Daniel Jang, Nouar Aldahoul, Aaron R. Kaufman, Talal Rahwan, and Yasir Zaki.