Roblox sued by Southern California families alleging children met predators on its platform

Video game platform Roblox is facing more lawsuits from parents who claim the San Mateo, Calif., company isn’t doing enough to protect children from sexual predators.
In a lawsuit filed in November, an anonymous Los Angeles County mother alleges that her daughter met a predator on Roblox and that the predator convinced her child to send sexually explicit photos of herself through the social media platform Discord. The woman is suing both Roblox and the San Francisco company Discord.
The woman thought Roblox was safe because it was marketed as educational and child-friendly when her daughter signed up for the gaming platform last year at age 12, according to the lawsuit filed in Los Angeles County Superior Court.
But the daughter later befriended a person on Roblox known as “Precious,” who claimed to be 15 years old and told the child she was being abused at home and had no friends, the lawsuit said. Accompanied by a friend’s parents, the daughter later met the Roblox user at a beach; the user appeared to be older and tried to introduce her to a group of older men.
After they met, the predator tried to persuade the girl to visit the predator’s Fullerton apartment alone and tried to alienate her from her family. According to the lawsuit, the girl suffered psychological trauma, depression and other emotional distress as a result of her experiences on Roblox and Discord.
The lawsuit accuses Roblox and Discord of prioritizing profit over safety, creating a “digital” and “real-life nightmare” for children. It also claims that the companies’ failures are systemic and that other children have suffered from encounters with predators on the platforms.
“Her innocence was taken away and her life will never be the same again,” the lawsuit said.
Roblox said in a statement that it was “deeply disturbed by any incident that endangers any user” and that it prioritizes online safety.
“We also recognize that no system is perfect, which is why we are constantly working to further improve our safety tools and platform restrictions to ensure parents can trust us to keep their children safe online, launching 145 new initiatives this year alone,” the statement said.
Discord said it takes safety seriously and requires users to be at least 13 years old to use its platform.
“We maintain strong systems to prevent the spread of sexual exploitation and harassment on our platform, and we also work with other technology companies and security organizations to improve online security across the internet,” the company said in a statement.
The lawsuit adds to the scrutiny facing Roblox, a platform popular with teens that more than 151 million people use every day. Earlier this year, the platform faced a wave of lawsuits from people in several states who claimed predators posed as children on the platform and sexually exploited them.
NBC4 News, which previously reported on the lawsuit, also reported that Roblox is facing another lawsuit from a Riverside, Calif., family that claims their child was sexually assaulted by a man the child met on Roblox. That man was sentenced to 15 years in prison.
Roblox is taking new steps this year to address growing child safety concerns. In November, the company announced that it would require users to verify their age before they can chat with other players. Roblox users will provide an ID or take a video selfie to verify their age. The verification feature estimates a person’s age, allowing the company to limit conversations between children and adults.
The lawsuit filed by the Los Angeles County woman alleges that the safety changes Roblox made in 2024 were “woefully inadequate” and came “too late.”
“All of these changes could have been implemented years ago,” the lawsuit said. “None of them involve any new or groundbreaking technology. Roblox only made progress when its stock was threatened.”