
A Delco mom can sue TikTok for her daughter’s death during a ‘blackout challenge,’ federal appeals court rules

A district judge previously ruled that federal law provides TikTok with immunity for the harms caused by content it hosts on the platform.

Nylah Anderson, 10, of Chester, liked TikTok videos. She accepted the "blackout challenge" in her personal TikTok feed last December as a fun dare, and asphyxiated herself. Her mother has sued TikTok in Philadelphia federal court. (Courtesy of Nylah Anderson's family)

The mother of a 10-year-old Chester girl who died attempting a viral “blackout challenge” inspired by videos she saw on TikTok can sue the social media platform for the role she says it played in her daughter’s death, a federal appeals court in Philadelphia ruled.

Federal law has for decades protected online publishers from suits over content posted to their platforms by others. But in its decision Tuesday, the U.S. Court of Appeals for the Third Circuit held that those protections don’t necessarily extend to the algorithms social media companies design to serve that content to their users.

“TikTok makes choices about the content recommended and promoted to specific users, and by doing so, is engaged in its own first-party speech,” Circuit Judge Patty Shwartz wrote for the three-judge panel that decided the case.


The ruling, which legal experts described as novel, could carry significant repercussions for social media companies, which for years have relied on those legal protections to fend off attempts to hold them liable for content they host.

Though courts have, in recent years, pushed back on those protections, the Third Circuit’s ruling appears to go further than prior decisions, said Amy Landers, director of the intellectual property law program at Drexel University.

“As far as I know, that is the furthest that any court has gone,” she said.

Neither TikTok nor its owner, ByteDance Inc., responded to requests for comment Wednesday.

For Tawainna Anderson, mother of Nylah — who accidentally hanged herself with her mother’s purse in 2021 while mimicking the videos presented to her by TikTok’s “For You Page” — the ruling revives a lawsuit she filed after her daughter’s death.

“Nothing will bring back our beautiful baby girl. But we are comforted knowing that — by holding TikTok accountable — our tragedy may help other families avoid future, unimaginable suffering,” the Anderson family said in a statement through their lawyers.

The blackout challenge

The blackout challenge was a TikTok trend that gained traction in 2021, especially among children. Kids used household items to choke themselves until they blacked out, and then uploaded the video to the social media platform.

The deaths of at least 20 children — including 15 who were age 12 or younger — have been linked to the blackout challenge, according to Bloomberg Businessweek.

Among those children was Nylah, who liked to dance along to videos on TikTok.

Nylah attempted the blackout challenge in December 2021 after learning about it on her “For You Page,” through which TikTok recommends content. She hung her mother’s purse on a hanger in a closet at her home, placed her neck on the strap, and passed out.

When her mother found her, she was unconscious. Nylah died five days later in a children’s hospital.

By the time Nylah died, TikTok was aware of the problem of dangerous challenges on the platform. A month prior, the company had shared the results of a global survey about teens’ use of the platform, part of an effort to “better understand how teens engage with potentially harmful challenges and hoaxes.” One in six teens surveyed said they had come across a challenge they considered risky and dangerous, or very dangerous.

The lawsuit

Anderson sued TikTok and ByteDance in federal court in 2022, arguing that TikTok continued to allow blackout challenge videos on the platform after it knew they were causing harm. Worse, her lawyers said, the platform’s algorithm promoted the challenge to children through the curation of the “For You Page.”

Within months, though, a U.S. district judge tossed out the suit, citing the federal statute — colloquially known as “Section 230” — that protects platforms from suits over third-party content they host.

“Defendants did not create the challenge; rather, they made it readily available on their site,” Judge Paul Diamond wrote. “Defendants published that work — exactly the activity Section 230 shields from liability.”

In its ruling Tuesday, the Third Circuit disagreed. While the panel agreed that Anderson can’t sue TikTok for merely hosting the blackout challenge videos that led to her daughter’s death, Shwartz wrote that the social media platform did more in Nylah’s case.


TikTok’s algorithm actively promoted the videos to Nylah through her “For You Page,” which tailors the content it serves based on demographics, metadata, and other information that isn’t exclusively based on choices users make about what videos they want to see.

That setup amounts to an editorial decision, Shwartz wrote in the court’s opinion. TikTok’s algorithm, she said, is itself a form of “expressive content” for which the company can be sued.

“Nylah ... likely had no idea what she was doing or that following along with the images on her screen would kill her,” Circuit Judge Paul Matey wrote in partial concurrence. “But TikTok knew that Nylah would watch because the company’s customized algorithm placed the videos on her ‘For You Page.’”

Tuesday’s ruling sends Anderson’s suit back to the lower court, where she will now have to prove that TikTok’s algorithm played a role in her daughter’s death.

“She will be entitled to her day in court,” her attorney, Jeffrey Goodman, said.