Roblox, Discord, OpenAI and Google found new child safety group

Engadget 

Roblox, Discord, OpenAI and Google are launching a nonprofit organization called ROOST, or Robust Open Online Safety Tools, which aims "to build scalable, interoperable safety infrastructure suited for the AI era." The organization plans to provide free, open-source safety tools for public and private organizations to use on their own platforms, with an initial focus on child safety. The press release announcing ROOST specifically calls out plans to offer "tools to detect, review, and report child sexual abuse material (CSAM)." Partner companies are providing both the funding for these tools and the technical expertise to build them. ROOST's operating theory is that access to generative AI is rapidly changing the online landscape, making the need for "reliable and accessible safety infrastructure" all the more urgent.