Grieving families sue TikTok, blaming it for the deaths of their children


The TikTok algorithm that surfaces one ridiculous and entertaining video after another keeps users glued to their screens and addicted to the wildly popular Facebook rival.


However, that same algorithm can also endanger the app’s youngest users by surfacing content that puts their lives at risk.

That’s the claim at the center of multiple lawsuits filed against TikTok. The suits were brought by families of young children who died after attempting the blackout challenge, one of many trends that have gone viral on the app. Sadly, all of the children at the center of these suits were younger than 15. And one of the suits points to seven children who died last year as a result of the blackout challenge.

TikTok Blackout Challenge

We should add that this news comes two years after the Trump administration tried to engineer a ban of TikTok from the US on national security grounds. Trump officials thought the app’s China-based parent company represented a threat. In response, the company insisted that its US arm was sufficiently walled off from the corporate mothership. The Trump administration’s fight ultimately got waylaid because of the Covid pandemic.


It turns out, though, that the claims from TikTok weren’t true. Some portion of US TikTok user data has in fact been accessible in China. As a result, an FCC commissioner is now calling on Google and Apple to take action. Specifically, to follow their respective app store policies and give TikTok the boot from the App Store and Play Store.

Meanwhile, TikTok’s bad press keeps multiplying. Against that backdrop comes news that the app may have motivated several young people to partake in a viral challenge that led to their deaths. The blackout challenge, according to one complaint, “encourages users to choke themselves with belts, purse strings, or anything similar until passing out.”

The company’s response

In response to the blackout challenge videos proliferating on the app, TikTok says it now blocks users from searching for them. Instead, a safety warning screen appears.

Importantly, though, at least some of the lawsuits against TikTok over this issue say the children who died never searched for the videos proactively. Rather, TikTok’s algorithm put them in front of their eyes on its own, via the app’s For You page.


Source: Andy Mee
