
Roblox and Discord Fail to Stop Sexual Predators From Exploiting Kids, Lawsuit Alleges


The popular gaming and messaging platforms, both based in the Bay Area, are accused of misrepresenting their safety to children and their parents. (Jakub Porzycki/NurPhoto via Getty Images)

Attorneys representing a 13-year-old boy in a lawsuit against Bay Area tech companies Roblox and Discord are alleging that the platforms’ lack of safeguards allows predators to sexually exploit and abuse minors.

The lawsuit, filed last week in San Mateo County Superior Court, alleges that the video game and messaging companies’ misrepresentation of safety on their apps and certain aspects of their design allowed an adult user to sexually coerce the boy. As a result of their negligence, the child suffered extensive psychological harm, the lawsuit alleges.

According to the attorneys, sexual predators are able to use the popular Roblox game platform to meet and groom children, then move to Discord, where they can chat via text, voice and video messages.

“There’s a systemic failure of safety and a systemic set of misrepresentations,” said Alexandra Walsh, a partner at Anapol Weiss who is representing the 13-year-old in the lawsuit. “Horrifically, our client in this case is one child of many who have been grievously injured as a result of that.”


The boy, unnamed in the lawsuit, first created accounts on Roblox and Discord in 2023 with permission from his father, who had been assured by the companies’ online representations that their platforms were safe for minors. Last year, the boy’s parents discovered disturbing messages on their child’s phone from an adult who was threatening him and demanding nude photos and sexually explicit content. Further investigation revealed that the boy had already sent pictures in exchange for Robux, the video game’s online currency, and had previously made plans to meet the man in person.

Shortly after, his parents reported the adult user, whom attorneys identified as Sebastian Romero, 27, to law enforcement in New Jersey, where they lived at the time. After raiding Romero’s home, police also found reason to believe he was responsible for the sexual exploitation of more than 20 other minors, according to the lawsuit. Romero was charged last year with multiple counts of sexual abuse and extortion.

Because the boy had previously shared his home address with Romero, he and his family have since moved across the country out of fear for their safety. The situation has resulted in both financial and emotional ruin, the lawsuit states.

Encrypted messaging services and other available features on Roblox and Discord allow predators to easily identify, contact and groom children, Walsh said. The companies receive thousands of complaints from parents and users every year, but that information isn’t shared publicly, and parents are thus led to believe that their children will be safe while playing, she added.

According to Walsh, attorneys on the case are also preparing a class action lawsuit against Roblox and have been in communication with other victims and parents who are demanding refunds. Many of these users have spent thousands of dollars on the video game, Walsh said.

“There are thousands of kids who’ve already been hurt,” Walsh said. “There are thousands of more kids who are at risk, and Roblox and Discord know exactly what’s happening on their platforms. They have the resources to protect children, but they’re prioritizing financial gain over the safety of our kids.”

The lawsuit demands that the companies provide the 13-year-old boy with financial compensation for the harm Romero caused him.

A spokesperson for Roblox said the company cannot comment on ongoing litigation but that it “takes the safety of its community very seriously” and is introducing new safety measures. Discord did not respond to a request for comment.

This is not the first time Roblox and Discord have faced legal action over their alleged negligence in protecting young users. The companies were sued in San Francisco County Superior Court by the Social Media Victims Law Center in 2022 after a young girl was sexually exploited by adult users who contacted her through the platforms’ direct messaging services.

Roblox has also faced public backlash for allowing users to engage in sexually explicit activity with little oversight. Children who are exposed to these games suffer from real psychological trauma, according to last week’s lawsuit.

The video game company introduced new safety measures to its platform last year, giving parents more control over what their children do on the app and restricting direct messaging features for young users. According to the lawsuit, however, these changes are still not enough.
