Roblox and Discord hit with another lawsuit after teen dies

Editor’s Note: This article contains discussions of suicide. Reader discretion is advised. If you or someone you know is struggling with thoughts of suicide, you can find resources in your area on the National Crisis Line website or by calling 988.

SAN FRANCISCO (KRON) — A new lawsuit filed against Roblox and Discord accuses the online gaming companies of causing a 13-year-old Kentucky girl’s death.

The wrongful death lawsuit filed by law firm Anapol Weiss states that Audree Heine killed herself after months of “manipulation and extremist grooming” from users on Roblox and Discord.

The case, Seitz v. Roblox Corporation and Discord Inc., alleges Audree was manipulated by a Roblox community “dedicated to glorifying violence and emulating notorious mass shootings like Columbine.”

Attorneys claim that Roblox, despite multiple child safety updates, has failed to implement effective moderation or age verification systems. Attorney Alexandra Walsh said, “The trauma that results is horrific, from grooming, to exploitation, to actual assault. In this case, a child lost her life. This needs to stop.”

Discord, headquartered in San Francisco, is an app for gamers looking to communicate with each other while playing video games. Roblox, headquartered in San Mateo, promotes its games to millions of users as “the ultimate virtual universe that lets you create, share experiences with friends, and be anything you can imagine.”

Audree was an avid Roblox user who relied heavily on the app for social interaction, the suit states. She lived in Boone County, Kentucky, and began using the app when she was eight years old.

Her mother set up parental controls on Audree’s Discord and Roblox accounts, believing that the controls would ensure her daughter’s safety, the lawsuit states.

Audree was a “prime target for the communities of violent extremists who roam the app looking for socially isolated and vulnerable children. These include the True Crime Community, TCC, which idolizes mass shooters and overlaps with other violent and extremist ideologies. Through Roblox, Audree was exposed to emotional manipulation and social pressure by other users, including TCC members,” the lawsuit states.

TCC encouraged violence against others and self-harm, according to the lawsuit.

Audree also joined the messaging app Discord, where social pressure was magnified through various chat rooms called “servers,” the suit states. She died in December 2024, one week after her 13th birthday.

Dozens of lawsuits have been filed against the two companies in recent years. Many parents of victimized children said they did not know anything was wrong until it was too late.

The suit filed on behalf of Audree’s mother claims that if Discord and Roblox had effective age verification systems and parental controls, Audree would still be alive.

The lawsuit states, “Time and again Defendants have refused to invest in basic safety features to protect against exactly the kind of harm that Audree suffered, leaving parents with no appropriate safeguards. Without these measures, cultures of peer pressure and violence have spread across Defendants’ platforms, increasingly pushing impressionable young users to harm others and themselves.”

The lawsuit was filed Monday in the U.S. District Court for the Eastern District of Kentucky.