The Meta decision: you can't put the toothpaste back in the tube
You can relinquish fact-checking responsibilities, but can you ever be neutral again?

"In recent years we've developed increasingly complex systems to manage content across our platforms, partly in response to societal and political pressure to moderate content. This approach has gone too far. As well-intentioned as many of these efforts have been, they have expanded over time to the point where we are making too many mistakes. We want to undo the mission creep that has made our rules too restrictive and too prone to over-enforcement."

- Excerpt from Meta's press release on January 7, 2025 (emphasis mine)

Is it possible for a post to be so wrong but also kind of right at the same time? Of course it is. Nuance is a dying art. But that's the challenge with deciding what's right and wrong: things can be completely true, partially true, or not true at all.

Unfortunately, when you've claimed the word "meta" from our lexicon, and then you announce that you've done a bad job deciding what's right, so you've decided to stop deciding what's right, but you get that decision wrong, we're out of words to describe the irony.

That's effectively what Meta has done in their announcement this week to stop using independent fact-checkers on their platforms and instead shift to a crowdsourced community notes model, similar to X's (formerly Twitter's) approach. The public response has been palpable. In the same week that the incoming president-elect repeatedly trolls about wanting to annex Canada, a culture war is brewing over raw milk consumption, and conspiracy theories about California wildfires are spreading as fast as the fires themselves, this seems like the absolute wrong time to step away from fact-checking.

At the same time, I can empathize with Meta's statement. I design technology for a living. I have to make tough ethical decisions. I encounter scope creep and mission creep all the time. I, too, have approached a problem with good intentions only to find an adverse unintended consequence. In their own words, "we didn't want to be the arbiters of truth," and I wouldn't want to entrust them with that responsibility either.

Many product decisions are unidirectional. I've had to shelve some of my riskier design ideas because consumer trust is a delicate matter: once you release a feature to the public, you can't always pull it back. But then again, I'm not usually the type to move fast and break things.

Not every ethical dilemma carries the same weight, and this one is heavy. I can't blame Meta for initially wanting to stay neutral, but when they decided in 2016 to use third-party fact-checking to moderate content across their platforms, they altered the social fabric of the internet in a way they can never fully take back.

Once you've had a finger on the scale of truth, any omission of fact-checking becomes a permission to lie. The misinformation is already out there, and you can't put that toothpaste back in the tube.

You used to be able to lie on the internet

Does it make me sound old to say the internet used to be a different place? In the halcyon days of the early 2000s, the internet still felt largely like the Wild West. People weren't constantly online, identities were still mostly anonymous, and communities were spread thinner across esoteric websites and interests. You almost expected anything you read online to be a lie, and sometimes that was half of the fun.

[Image: I'm pretty sure I posted this on MySpace and thought it was hilarious]

So what changed?
For one thing, as we began to spend more time online, we moved more of our IRL social lives to the internet, which made it beneficial for everybody to be in the same few places. Aggregators began to distill the best content from corners of the internet into just a few destinations. Memes transformed from inside-joke shibboleths to a shared cultural identity. And going viral went from an innocent, seemingly random phenomenon to a carefully calculated, focus-grouped business proposition.

In other words, we became a captive audience, and people learned that they could profit off our attention.

The internet became too legit to quit

By the early 2010s, things on the internet began to matter. By then, seemingly everyone had a digital presence; if you weren't there, you were probably missing out.

People started to take notice when online movements proved they could mobilize people and ideas in powerful ways. 2011's Arab Spring revolutions proved that Twitter (now X) was more than just a place to talk about your lunch. Similar movements around the world followed. Even fringe phenomena like Twitch Plays Pokémon demonstrated that the internet hivemind was more than just a theory or a joke. But any scenario where that could be used for good meant that it could also be used for bad.

The timeline followed pretty swiftly from Facebook's 2012 social experiment about manipulating emotions to Cambridge Analytica's social media influence in the 2016 election. In the same blink of an eye, "fake news" entered our everyday vocabulary and desensitized us while sites like InfoWars lost all touch with reality.

Lying on the internet was no longer fun. By the late 2010s you were more likely to be pulling your friend out of a pyramid scheme or worrying that your parents would fall for a crypto scam.

We wanted the truth. We couldn't handle the truth.

Can any one entity really be an unbiased judge of the truth? In hindsight, you might wonder why anyone would willingly step into the morass of content moderation, but in the context of the 2010s you can understand why Meta (and its peers) had to step in. Since 2016, Meta has had mechanisms in place to proactively flag content that's known to be false or bury content that aligns with hoaxes. Facebook automatically flags and removes posts and comments that share similarities with hate speech. While the policies were in place, Meta claimed that they were working as intended, but this week's announcement contradicts that. (See how the truth can change?)

Then again, it's now 2025, content moderation policies have been in place for nine years, and anecdotally I'm not sure that I feel any more insulated from fake news than I did before. But I can assume that I'm already in a media-literate bubble and not likely to encounter as much fake news in the first place. Those outside my bubble might see more flagged content, but that only fuels the fire among those who believe that content moderation is biased against their point of view. Conspiracy theories are only strengthened by the idea that "they" don't want you to see them. At the end of the day, how effective is fact-checking among the willfully ignorant?

Omission becomes permission

Whether the previous fact-checking mechanisms were effective or not, you can't simply remove them without creating a vacuum. It would be different if there had never been a moderation playbook in the first place, but that's not the case. When the rules are relaxed by declaration, that relaxation becomes a vulnerability for anyone who wants to exploit it.
A community notes approach can only do so much when fake news proponents can now kick down all the doors that used to hold them at bay.

Worse yet, by relaxing these rules, Meta has redefined its Community Standards and, in the process, explicitly defined new forms of hate speech that are acceptable. Again, it would be one thing if these examples of hate speech had never been defined in the first place, but by walking back from one moral stance to another, Meta provides a list of socially permissible ways to bully or harass formerly protected classes of people. Among that list, ethnic groups can now be called filth, women can be referred to as household objects, and LGBT individuals can be called mentally ill (and are notably the only exception among otherwise disallowed mental-condition insults).

If you've ever met a bully or troll, you already know that when a boundary is (re)drawn, they will crowd that line as much as they can get away with.

What now?

The sociologist in me reminds me to have faith in both humans and academia.

If I can set aside my cynicism, I can remember that most people are not bad-faith actors. Most people are not willfully ignorant. Most people do want the real news. Those are the people who usually sort things out for themselves pretty well. In the absence of independent fact-checkers, it becomes even more imperative that we do our own research and call out the bullshit when we see it. I don't agree that community notes are the best approach, but they work better if good people contribute to them. And if you've read this far into this essay, then you're probably one of them.

And now, more than ever, I appreciate the important research of Professor Kate Starbird at the UW Center for an Informed Public, which has been studying, tracking, and understanding the spread of false information since long before it was cool. There's always a strong source of truth in peer-reviewed research, even if you have to seek it out for yourself.