Mother Alleges TikTok's Algorithm Recommended Challenge That Led to Daughter's Death

A 10-year-old who died last year had participated in a so-called 'blackout,' or 'choking,' challenge.

Photo: phBodrova (Shutterstock)

The mother of a 10-year-old girl who died last year is suing TikTok and its parent company, ByteDance, alleging the company's algorithm promoted a so-called "Blackout Challenge" to the child's feed.

In a complaint filed Thursday, Tawainna Anderson, of Pennsylvania, said her daughter Nylah died last year after asphyxiating while attempting the so-called "Blackout Challenge," which encourages people to record themselves holding their breath or choking themselves until they pass out. Anderson said she rushed her daughter to a local hospital on Dec. 7, but Nylah died of her injuries on Dec. 12.

Court documents claim the challenge was recommended to Nylah by TikTok's algorithm, which "determined that the deadly Blackout Challenge was well-tailored and likely to be of interest to 10-year-old Nylah Anderson."

The Blackout Challenge has reportedly been circulating on the platform for years, though similar challenges have been part of the school playground environment for decades. Still, Nylah's death follows a rash of similar, heavily publicized choking incidents tied to TikTok over the past few years. Another 10-year-old girl, in Italy, died after attempting the challenge last January, and a 12-year-old Colorado boy died in April 2021 after attempting it.

In an emailed statement, a TikTok spokesperson said: "This disturbing 'challenge,' which people seem to learn about from sources other than TikTok, long predates our platform and has never been a TikTok trend. We remain vigilant in our commitment to user safety and would immediately remove related content if found. Our deepest sympathies go out to the family for their tragic loss."

The platform has explicit rules against content that promotes self-harm. The app does offer a curated version for users under 13 that limits the personal details users can share and restricts their ability to comment or post content, but it's unclear how its automated systems might prevent such content from appearing in users' feeds.

The platform is rated 12+ on both the Apple and Google app stores, but like most apps, all it takes to make an account is to claim you’re above the age limit. The company claimed it removed more than 15 million underage accounts last year.

During a press conference Thursday, one of Anderson's attorneys, Bob Mangeluzzi, said: "TikTok is one of the most powerful and technologically advanced companies in the world, so what did TikTok do once it learned of this?... [they] used their app and algorithm to forward a blackout challenge video to a 10-year-old."

The complaint describes the app's algorithm as intentionally designed to "maximize user engagement and dependence," encouraging children to use the app repeatedly. The lawsuit targets TikTok both as the designer of the algorithm and as the distributor that promoted the content to Nylah.

“It is time that these dangerous challenges come to an end,” Anderson said during the press conference. “Something has to change, something has to stop because I wouldn’t want any other parent to go through what I’m going through.”

This lawsuit is not the only legal action going after TikTok over allegations that it promotes dangerous content to children. In March, news broke that several state attorneys general are investigating whether TikTok is harmful to young users, and whether the company is aware of the content younger users see.

TikTok has rapidly become one of the most popular social media platforms in the world, and it is expected to make more in advertising revenue than Twitter and Snap combined.
