Roblox, Discord, OpenAI, and Google found new child safety group
www.theverge.com
Google, OpenAI, Roblox, and Discord have formed a new non-profit organization to help improve child safety online. The Robust Open Online Safety Tools (ROOST) initiative aims to make core safety technologies more accessible for companies and to provide free, open-source AI tools for identifying, reviewing, and reporting child sexual abuse material.

The initiative was partially motivated by the changes that generative AI advancements have made to online environments, and it aims to address "a critical need to accelerate innovation in online child safety," according to founding ROOST partner and former Google CEO Eric Schmidt. Details about the CSAM detection tools are slim beyond the fact that they will utilize large language AI models and unify existing options for dealing with the content.

"Starting with a platform focused on child protection, ROOST's collaborative, open-source approach will foster innovation and make essential infrastructure more transparent, accessible, and inclusive, with the goal of creating a safer internet for everyone," said Schmidt.

The ROOST announcement comes amid a major regulatory battle over child safety on social media and online platforms, with companies seeking to appease lawmakers through self-regulation.

The National Center for Missing and Exploited Children (NCMEC) reports that suspected child exploitation increased by 12 percent between 2022 and 2023. As of 2020, over half of US children were on Roblox, and the company has been repeatedly criticized for failing to tackle child sexual exploitation and exposure to inappropriate content on its platform. Roblox and Discord were also singled out in a social media lawsuit filed in 2022 that alleged the platforms failed to stop adults from messaging children without supervision.

Some founding members of ROOST, like Discord, are providing funding, while others are contributing their tools or expertise to the project.
ROOST says it's partnering with leading AI foundation model developers to build a "community of practice" for content safeguards, which will include providing vetted AI training datasets and identifying gaps in safety.

The initiative says it will make tools that already exist more accessible, effectively combining various detection and reporting technologies from its member organizations into a unified solution that's easier for other companies to implement. Naren Koneru, Roblox's vice president of engineering, trust, and safety, told Fast Company that ROOST may host AI moderation systems that companies can integrate through API calls.

There's some ambiguity about what ROOST's AI moderation tools will include, however. For example, Discord says its contributions will build on the Lantern cross-platform information-sharing project it joined in 2023 alongside Meta and Google. The tools could also include an updated version of Roblox's AI model for detecting profanity, racism, bullying, sexting, and other inappropriate content in audio clips, which the company is planning to open-source this year. It's unclear precisely how the tools will intersect with existing first-line CSAM detection systems like Microsoft's PhotoDNA image analysis tool.

Alongside its participation in ROOST, Discord has released a new Ignore feature that allows users to hide messages and notifications they receive without notifying the people they have muted. "At Discord, we believe that safety is a common good," Discord's chief legal officer Clint Smith said in the ROOST announcement. "We're committed to making the entire internet - not just Discord - a better and safer place, especially for young people."

ROOST has raised more than $27 million to support its first four years of operations, backed by philanthropic organizations including the McGovern Foundation, the Future of Online Trust and Safety Fund, the Knight Foundation, and the AI Collaborative.
The ROOST organization will also be supported by experts in child safety, artificial intelligence, open-source technology, and countering violent extremism, according to the press release.