In a concerted effort to enhance online safety for children, technology giants Roblox, Discord, OpenAI, and Google have jointly established the Robust Open Online Safety Tools (ROOST) initiative. This non-profit organization aims to provide free, open-source AI tools designed to identify, review, and report child sexual abuse material (CSAM), thereby making core safety technologies more widely accessible across the industry.
ROOST's development was spurred in part by advances in generative AI, which have transformed online environments and the risks children face within them. According to Eric Schmidt, a founding partner of ROOST and former CEO of Google, the initiative seeks to fulfill “a critical need to accelerate innovation in online child safety.” Details on the AI-driven CSAM detection tools remain limited, but they are expected to leverage large language models and consolidate existing solutions for handling offensive content.
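Since none of those tools have been published, the following is only a minimal sketch of what an LLM-backed safety classifier typically looks like; the label taxonomy, prompt, and `llm` callable are all invented for illustration.

```python
# Hypothetical sketch only: ROOST has not published its detection tools.
# This shows the general shape of an LLM-backed safety classifier: a model
# labels each message against a fixed policy taxonomy, and anything the
# model cannot label cleanly is routed to human review.
from dataclasses import dataclass
from typing import Callable

LABELS = ("ok", "needs_human_review", "report_candidate")  # invented taxonomy

PROMPT = (
    "You are a child-safety classifier. Label the message with exactly one "
    "of: " + ", ".join(LABELS) + ".\n\nMessage: {message}\nLabel:"
)

@dataclass
class Verdict:
    label: str

def classify(message: str, llm: Callable[[str], str]) -> Verdict:
    """Ask an LLM for a label; fall back to human review on anything else."""
    raw = llm(PROMPT.format(message=message)).strip().lower()
    return Verdict(raw if raw in LABELS else "needs_human_review")

if __name__ == "__main__":
    # Stand-in model; a real deployment would call a hosted LLM here and
    # route non-"ok" verdicts into a moderation queue.
    print(classify("hello there", lambda prompt: "ok"))
```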
ROOST’s strategy is to build a child-protection platform through a collaborative, open-source approach, one intended to spur innovation and make essential safety infrastructure more transparent, accessible, and inclusive, ultimately fostering a safer internet for all users.
The announcement of ROOST comes as major technology and social media firms face mounting regulatory pressure to improve child safety on their platforms. In this climate, self-regulation through initiatives like ROOST has taken on added importance. The National Center for Missing and Exploited Children (NCMEC) reported a 12 percent rise in reports of suspected child exploitation from 2022 to 2023, further underscoring the urgency of these efforts.
Roblox, whose platform reached more than half of U.S. children by 2020, and Discord have both been criticized for failing to prevent child sexual exploitation and exposure to inappropriate content. Both companies faced legal action in 2022 for allegedly allowing unsupervised interactions between adults and children. With ROOST, they have committed to tackling these challenges head-on.
ROOST’s founding members are not only providing financial backing but also offering technical expertise and tools to the initiative. ROOST aims to partner with AI foundation model developers to create a “community of practice” dedicated to content safeguards. This will involve offering vetted AI training datasets and identifying safety gaps.
ROOST plans to make existing tools more accessible by integrating detection and reporting technologies from its member organizations into a cohesive solution, simplifying adoption for other companies. Roblox’s vice president of engineering, trust, and safety, Naren Koneru, suggests that ROOST might host AI moderation systems, accessible via API calls, though details are not yet fully specified.
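To make Koneru's description concrete, here is a hedged sketch of how a platform might call such a hosted moderation service; since ROOST has announced no endpoint or schema, the URL, request fields, and response shape below are placeholders.

```python
# Hypothetical sketch: no ROOST API has been announced. The endpoint,
# request fields, and response schema below are placeholders that show how
# a platform might call a hosted moderation service over HTTP.
import json
import urllib.request

ENDPOINT = "https://roost.example/v1/moderate"  # placeholder URL

def moderate(text: str, api_key: str) -> dict:
    """POST a piece of content to the hosted classifier and return its verdict."""
    body = json.dumps({"content": text}).encode("utf-8")
    req = urllib.request.Request(
        ENDPOINT,
        data=body,  # presence of a body makes this a POST request
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {api_key}",
        },
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)

# A caller would branch on the returned verdict, for example queueing
# flagged items for human review or filing reports where required.
```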
Contributions from Discord and other members will bolster existing projects such as Lantern, a cross-platform information-sharing program whose participants also include Meta and Google. In addition, Roblox plans to open-source its AI model for detecting inappropriate content later this year, potentially folding it into the ROOST framework.
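Lantern-style programs work by exchanging "signals," such as hashes of known violating content, between platforms. As a rough illustration of the mechanism only (Lantern's actual formats are not described here, and production systems use perceptual rather than exact hashes), a minimal hash-matching check might look like this:

```python
# Generic illustration of hash-based signal matching, the broad mechanism
# behind cross-platform programs like Lantern. Real systems use perceptual
# hashes (e.g. PDQ or PhotoDNA) so near-duplicates still match; SHA-256,
# used here for simplicity, only catches byte-for-byte copies.
import hashlib

# Hashes of known-violating files, as contributed by partner platforms.
shared_signals = {
    hashlib.sha256(b"known bad file bytes").hexdigest(),
}

def matches_shared_signal(file_bytes: bytes) -> bool:
    """Return True if the file exactly matches a shared hash signal."""
    return hashlib.sha256(file_bytes).hexdigest() in shared_signals

print(matches_shared_signal(b"known bad file bytes"))    # True
print(matches_shared_signal(b"slightly altered bytes"))  # False: exact match only
```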
Besides participating in ROOST, Discord has launched an “Ignore” feature to help users manage unwanted communications discreetly. Clint Smith, Discord’s Chief Legal Officer, emphasized the company’s dedication to internet safety, especially for young users, stating, “We’re committed to making the entire internet - not just Discord - a better and safer place, especially for young people.”
With over $27 million raised for its operations over the next four years, ROOST is supported by various philanthropic and expert organizations, including the McGovern Foundation, Future of Online Trust and Safety Fund, Knight Foundation, and the AI Collaborative. Additionally, the organization benefits from the expertise of professionals in child safety, artificial intelligence, open-source technology, and countering violent extremism.