WWW.THEVERGE.COM
Meta is leaving its users to wade through hate and disinformation
Experts warn that Meta's decision to end its third-party fact-checking program could allow disinformation and hate to fester online and permeate the real world. The company announced today that it's phasing out a program launched in 2016 in which it partners with independent fact-checkers around the world to identify and review misinformation across its social media platforms. Meta is replacing the program with a crowdsourced approach to content moderation similar to X's Community Notes.

Meta is essentially shifting responsibility to users to weed out lies on Facebook, Instagram, Threads, and WhatsApp, raising fears that it'll be easier to spread misleading information about climate change, clean energy, public health risks, and communities often targeted with violence.

"It's going to hurt Meta's users first because the program worked well at reducing the virality of hoax content and conspiracy theories," says Angie Drobnic Holan, director of the International Fact-Checking Network (IFCN) at Poynter.

"A lot of people think Community Notes-style moderation doesn't work at all and it's merely window dressing so that platforms can say they're doing something ... most people do not want to have to wade through a bunch of misinformation on social media, fact checking everything for themselves," Holan adds. "The losers here are people who want to be able to go on social media and not be overwhelmed with false information."

In a video, Meta CEO Mark Zuckerberg claimed the decision was a matter of promoting free speech while also calling fact-checkers "too politically biased." Meta also said that its program was too sensitive and that 1 to 2 out of every 10 pieces of content it took down in December were mistakes and might not have actually violated company policies.

Holan says the video was "incredibly unfair" to fact-checkers who have worked with Meta as partners for nearly a decade.
Meta worked specifically with IFCN-certified fact-checkers who had to follow the network's Code of Principles as well as Meta's own policies. Fact-checkers reviewed content and rated its accuracy. But Meta, not fact-checkers, makes the call when it comes to removing content or limiting its reach. Poynter owns PolitiFact, one of the fact-checking partners Meta works with in the US. Holan was the editor-in-chief of PolitiFact before stepping into her role at IFCN.

That process covers a broad range of topics, from false information about celebrities dying to claims about miracle cures, Holan notes. Meta launched the program in 2016 amid growing public concern around the potential for social media to amplify unverified rumors online, like false stories about the pope endorsing Donald Trump for president that year.

Meta's decision looks more like an effort to curry favor with President-elect Trump. In his video, Zuckerberg described recent elections as "a cultural tipping point" toward free speech. The company recently named Republican lobbyist Joel Kaplan as its new chief global affairs officer and added UFC CEO and president Dana White, a close friend of Trump, to its board. Trump also said today that the changes at Meta were "probably" in response to his threats.

"Zuck's announcement is a full bending of the knee to Trump and an attempt to catch up to [Elon] Musk in his race to the bottom. The implications are going to be widespread," Nina Jankowicz, CEO of the nonprofit American Sunlight Project and an adjunct professor at Syracuse University who researches disinformation, said in a post on Bluesky.

Twitter launched its community moderation program, called Birdwatch at the time, in 2021, before Musk took over. Musk, who helped bankroll Trump's campaign and is now set to lead the incoming administration's new Department of Government Efficiency, leaned into Community Notes after slashing the teams responsible for content moderation at Twitter.
Hate speech, including slurs against Black and transgender people, increased on the platform after Musk bought the company, according to research by the Center for Countering Digital Hate. (Musk then sued the center, but a federal judge dismissed the case last year.)

Advocates are now worried that harmful content might spread unhindered on Meta's platforms. "Meta is now saying it's up to you to spot the lies on its platforms, and that it's not their problem if you can't tell the difference, even if those lies, hate, or scams end up hurting you," Imran Ahmed, founder and CEO of the Center for Countering Digital Hate, said in an email. Ahmed describes it as "a huge step back for online safety, transparency, and accountability" and says it could have terrible offline consequences in the form of real-world harm.

"By abandoning fact-checking, Meta is opening the door to unchecked hateful disinformation about already targeted communities like Black, brown, immigrant and trans people, which too often leads to offline violence," Nicole Sugerman, campaign manager at the nonprofit Kairos, which works to counter race- and gender-based hate online, said in an emailed statement to The Verge today.

Meta's announcement today specifically says that it's "getting rid of a number of restrictions on topics like immigration, gender identity and gender that are the subject of frequent political discourse and debate."

Scientists and environmental groups are wary of the changes at Meta, too. "Mark Zuckerberg's decision to abandon efforts to check facts and correct misinformation and disinformation means that anti-scientific content will continue to proliferate on Meta platforms," Kate Cell, senior climate campaign manager at the Union of Concerned Scientists, said in an emailed statement.

"I think this is a terrible decision ... disinformation's effects on our policies have become more and more obvious," says Michael Khoo, a climate disinformation program director at Friends of the Earth.
He points to attacks on wind power affecting renewable energy projects as an example. Khoo also likens the Community Notes approach to the fossil fuel industry's marketing of recycling as a solution to plastic waste. In reality, recycling has done little to stem the tide of plastic pollution flooding into the environment, since the material is difficult to reprocess and many plastic products are not really recyclable. The strategy also puts the onus on consumers to deal with a company's waste. "[Tech] companies need to own the problem of disinformation that their own algorithms are creating," Khoo tells The Verge.