Roblox is launching Sentinel, an open-source AI system designed to detect predatory language in chats, amid lawsuits alleging the platform does not do enough to protect children. In the first half of 2025, Roblox submitted 1,200 reports of potential child exploitation to the National Center for Missing and Exploited Children. Chief Safety Officer Matt Kaufman said that identifying child endangerment requires analyzing longer conversations rather than isolated messages, and Sentinel strengthens the company's ability to do so. The proactive measure aims to safeguard the platform's more than 111 million monthly users.
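
The report does not describe Sentinel's internals, but Kaufman's point about longer conversations suggests the general technique of scoring a rolling window of chat history rather than single messages. The sketch below is a purely hypothetical Python illustration of that idea; the pattern list, window size, and alert threshold are placeholder assumptions, not Roblox's actual Sentinel logic.

```python
from collections import deque
import re

# Hypothetical illustration only: a rolling-window monitor that scores
# conversation context rather than isolated messages. Patterns, window
# size, and threshold are assumptions for the sketch, not Sentinel's code.
SUSPICIOUS_PATTERNS = [
    r"\bhow old are you\b",
    r"\bdon'?t tell (your )?(mom|dad|parents)\b",
    r"\bkeep (this|it) (a )?secret\b",
    r"\bmove to (another|a different) app\b",
]

class ConversationMonitor:
    def __init__(self, window_size=50, alert_threshold=3):
        self.window = deque(maxlen=window_size)   # keep only recent messages
        self.alert_threshold = alert_threshold

    def add_message(self, sender, text):
        self.window.append((sender, text))

    def risk_score(self):
        # Count pattern hits across the whole window, so signals that look
        # benign in isolation can accumulate over a longer conversation.
        hits = 0
        for _, text in self.window:
            for pattern in SUSPICIOUS_PATTERNS:
                if re.search(pattern, text, flags=re.IGNORECASE):
                    hits += 1
        return hits

    def should_flag(self):
        return self.risk_score() >= self.alert_threshold


if __name__ == "__main__":
    monitor = ConversationMonitor()
    monitor.add_message("user_a", "hey, how old are you?")
    monitor.add_message("user_b", "13")
    monitor.add_message("user_a", "cool, keep this a secret ok")
    monitor.add_message("user_a", "let's move to another app")
    print("flag for human review:", monitor.should_flag())
```

In a production system of this kind, flagged conversations would typically go to human reviewers before any report is filed, which matches the workflow the article implies for Roblox's submissions to the National Center for Missing and Exploited Children.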