10-year-old Chester girl died doing a TikTok choking ‘challenge.’ Her mother is suing the video platform.
TikTok’s platform manipulates the behavior of children to maximize profits, the suit says. The firm said the blackout challenge “long predates our platform and has never been a TikTok trend.”
Nylah Anderson liked to dance along to TikTok and share the platform’s short videos. Then, in early December, TikTok’s “Blackout Challenge,” which dared people to choke themselves until they almost passed out, allegedly caught 10-year-old Nylah’s attention in her personalized TikTok feed.
Nylah hung her mother’s purse by strapping it to a hanger in the closet of her Chester home. She placed her head between the bag and the shoulder strap, putting pressure on her neck, according to court documents. The girl passed out before she could free herself; her mother found her unconscious. Nylah died five days later in a children’s hospital.
Nylah’s mother, Tawainna Anderson, sued TikTok and its parent company, ByteDance Inc., in federal court in Philadelphia late last week, accusing the social media platform of being a “dangerously defective” product whose algorithm fed the viral challenge to her daughter’s “For You Page,” leading to Nylah’s death.
TikTok’s platform, the suit claims, manipulates and controls the behavior of children to maximize profits and promote addiction while skirting responsibility for child safety.
A TikTok spokesperson in response called the blackout challenge “disturbing” but said that it “long predates our platform and has never been a TikTok trend.”
The spokesperson added that “we remain vigilant in our commitment to user safety and would immediately remove related content if found. Our deepest sympathies go out to the family for their tragic loss.”
Growing scrutiny for social media platforms
TikTok says that its social media platform is for those 13 and older, with a separate app for younger children called TikTok for Younger Users. The Chinese-owned TikTok removed 15 million suspected underage users from its platform in the last three months of 2021, the company said.
The Anderson suit is part of the growing global attention to the physical and mental health impact on children and teens of platforms such as TikTok, Snapchat, and Instagram, which is owned by Meta Platforms, also the parent company of Facebook. The engagement of young users has been popular with advertisers and fueled the growth of social media platforms, analysts say.
There’s “a major outcry and anger about their business practices,” said Jeff Chester, executive director of the Center for Digital Democracy, a digital rights group in Washington. “The platforms are placing the interest of children second to their own pursuit of profits.”
In October, Congress held hearings airing the potential dangers to teens of Instagram features that tabulate the approval of other users, such as “like” numbers and follower counts.
A former Facebook employee leaked a trove of the company’s internal research to the Wall Street Journal, which produced an investigative series that in turn led to the congressional probe. The series alleged in part that Instagram was aware of the threat to the mental health of teen girls — notably that it made them feel worse about their bodies.
Meta disputed that characterization, saying that Instagram was not “toxic” and released the internal research linked to the assertion.
In early March, a coalition of state attorneys general began investigating TikTok over whether it harms the physical and mental health of children and teens. Pennsylvania Attorney General Josh Shapiro is part of the coalition that is investigating the techniques TikTok uses “to boost young user engagement, including increasing the duration of time spent on the platform and frequency of engagement with the platform,” his office said.
Parents have taken action. In April, Wisconsin mother Donna Dawley sued Meta and Snap, parent company of image-sharing app Snapchat, for the January 2015 suicide of her son, Christopher J. Dawley, 17. He shot himself with a .22-caliber rifle as the family was packing away Christmas decorations. The federal suit claimed that as her son became addicted to social media, he was “progressively sleep deprived, and increasingly obsessed with his body image.”
Snap asked the judge to stay the proceedings, saying it could seek to transfer the case to the Los Angeles federal court, near its corporate headquarters in Santa Monica. The company told a Wisconsin judge that the defendants “are not legally responsible for this tragedy.”
In January, Tammy Rodriguez sued Meta, ByteDance, and Snap on behalf of her daughter Selena, 12, who allegedly developed an addiction to Instagram, Snapchat, and TikTok, leading to mental-health disorders and the exchange of sexually explicit messages. Tammy Rodriguez, of Enfield, Conn., found Selena unresponsive in July after the girl took her mother’s prescription pills; the girl died.
Snap is seeking to move this case, filed in San Francisco, to Los Angeles, court documents show.
“While we can’t comment on active litigation, our hearts go out to any family who has lost a loved one to suicide,” a Snap spokesperson said in response to both suits. “We intentionally built Snapchat differently than traditional social media platforms to be a place for people to connect with their real friends, and offer in-app mental health resources, including on suicide prevention for Snapchatters in need. Nothing is more important than the safety and well-being of our community and we are constantly exploring additional ways we can support Snapchatters.”
Meta did not respond to an email for comment on the Dawley and Rodriguez suits.
TikTok parent’s Philly-area connection
Susquehanna International Group, the Bala Cynwyd trading and investment firm co-founded by Jeff Yass, considered Pennsylvania’s wealthiest resident, was an early backer of TikTok’s parent company ByteDance, according to published reports.
The Guardian reported that TikTok’s advertising revenue this year could overtake the combined advertising revenue of Twitter and Snapchat, and that its rapid growth and popularity with young users has alarmed Meta, owner of the rival Facebook and Instagram platforms.
Meta has paid one of the biggest Republican consulting firms, Targeted Victory, to orchestrate a nationwide campaign seeking to turn the public against TikTok, according to the Washington Post.
Employees of the firm worked to undermine TikTok through a nationwide media and lobbying campaign portraying ByteDance as a danger to American children and society, according to internal emails shared with the news organization.
Meta spokesperson Andy Stone defended the campaign, saying “We believe all platforms, including TikTok, should face a level of scrutiny consistent with their growing success.” Targeted Victory declined to respond to questions in late March about the campaign, saying only that it has represented Meta for several years.
Philadelphia plaintiff’s firm Saltz Mongeluzzi & Bendesky brought the negligence, products-liability, and wrongful-death suit on behalf of Nylah’s estate. Lead attorney Robert J. Mongeluzzi said the suit “is about child safety. This was a child that was forwarded a horribly dangerous video.”
Although the blackout challenge may have existed before TikTok, Mongeluzzi said, “Nylah did not get this from other social media. TikTok sent it to her.”
Jeffrey P. Goodman, a Saltz attorney, said that the Nylah Anderson suit had nothing to do with the Meta campaign against TikTok.
Nylah’s was not the only death allegedly influenced by the TikTok challenge. A 12-year-old Colorado boy and a 10-year-old Italian girl also died last year participating in the Blackout Challenge, the children’s parents said in media reports.
The 46-page suit lists about 20 other challenges on the TikTok platform it claims are dangerous.
Among them are the “dry scoop challenge,” which dares people to eat a heaping scoop of undiluted powder that goes into energy drinks.
The “Benadryl challenge” dares participants to consume so much of the antihistamine that they hallucinate.