Chinese company ByteDance and its popular social media platform TikTok are facing a wrongful death suit in New York over the 2022 death of a teenager. The suit alleges that TikTok sent thousands of unsolicited suicide-themed videos to 16-year-old Chase Nasca, who was killed when he apparently stepped in front of a train after messaging a friend, “I can’t do it anymore.”
The suit was filed on behalf of Nasca’s family by the Social Media Victims Law Center in the Supreme Court of New York’s Suffolk County. It was announced March 22, ahead of TikTok CEO Shou Zi Chew’s appearance today before Congress in Washington.
The lawsuit seeks to hold TikTok responsible for “directly causing the mental condition and the resulting death of Chase Nasca, who died at the age of 16 after being targeted, overwhelmed, and actively goaded and encouraged by TikTok’s content delivery and recommendation products, programming decisions, and content to which TikTok itself actively and materially contributed and encouraged for its own economic gain.”
The lawsuit also seeks to hold the area’s transportation agencies and the town of Islip responsible for “creating a serious and foreseeable risk of harm to the plaintiff which led to his death.”
According to the complaint, some of the videos TikTok directed to Nasca suggested that young people should end their lives by stepping in front of a moving train. Shortly after receiving one such video, Nasca went to the Long Island Rail Road tracks near his home, where he was struck by a train and killed.
The Nascas seek unspecified damages, including damages for pain and suffering as well as punitive damages. They also seek injunctive relief ordering TikTok to stop the alleged harmful conduct, remedy the “unreasonably dangerous recommendation technologies” in its social media products, and warn minor users and their parents that TikTok’s social media product is “addictive and poses a clear and present danger to unsuspecting minors.”
The suit alleges that in 2021 TikTok began directing Nasca to “dangerous and harmful accounts that promoted highly depressive, violent, self-harm and suicide-themed” content, including adult accounts that “offered depressing and violent material a minor should not have been allowed to see.”
The suit also claims that content creators use TikTok’s “dark, suicide-themed songs” to make their videos more impactful, as well as trending hashtags to “ensure maximum amplification.”
“We are seeking to hold TikTok accountable for engaging in dangerous and harmful practices that put our children at risk of self-harm all in the name of ‘engagement’ to increase their ad revenues,” said Matthew P. Bergman, founding attorney of the Social Media Victims Law Center, in a press statement.
“To maximize user engagement and increase profits, TikTok creates and co-creates harmful content and deliberately targets children in the United States with violent, dangerous, extreme and psychologically disturbing content from which they cannot look away.”
The lawsuit also accuses the Metropolitan Transportation Authority (MTA), the MTA Long Island Rail Road and the town of Islip of negligence that created a dangerous, unfenced environment around the train tracks. According to the lawsuit, over the last 20 years at least 30 people have been struck and killed by trains within a half mile of the same intersection, in cases involving both suicide and accidental death.
According to Bergman, on China’s version of TikTok, minors 14 and younger are limited to 45 minutes per day and are directed to science experiments, museum exhibits, and patriotic and educational videos. “While the United States government has primarily been focused on protecting our national security, they need to focus more on protecting our nation’s children,” he said.
The Social Media Victims Law Center has filed other lawsuits against TikTok, including several over the deaths of young people who attempted the dangerous “blackout challenge” featured in TikTok videos. The challenge involves using objects to strangle oneself to the point of losing consciousness.
The center contends that TikTok’s platform design is inherently flawed in how it accepts users, selects which content to present to them, and uses an addictive algorithm to maximize screen time.
In the U.S., according to the complaint, TikTok claims that its proprietary algorithm is “a recommendation system that delivers content to each user that is likely to be of interest to that particular user…each person’s feed is unique and tailored to that specific individual.” In other words, TikTok claims to individually curate and identify videos it believes will be of interest to its users.
However, the complaint alleges that TikTok is not making recommendations to U.S. children; rather, “it is directing vulnerable youth to the subject matters TikTok selects, whether they want to see that content or not; and it is selecting content that, in many cases, they do not actually want to see.”
TikTok “knows that violent, dangerous, extreme, and psychologically disturbing content triggers a greater dopamine response in minors than safe and benign content,” the complaint alleges.
The Seattle-based firm has also sued other social media platforms, including Snapchat, alleging harm to minors.