TikTok is facing serious allegations from a group of attorneys general who claim the platform has intentionally created features and promoted content that are harmful to children. These legal actions aim to pierce the protections typically granted under Section 230 of the Communications Decency Act, which shields platforms from liability for user-generated content.

The AGs argue that TikTok employs addictive features designed to keep kids glued to their screens. These include autoplay videos, live content, and disappearing stories, all of which encourage longer viewing sessions. They also point to dangerous viral challenges linked to tragic incidents, including fatalities among teens.

The lawsuits assert that TikTok's practices violate several laws. One major claim is that the platform has breached the federal Children’s Online Privacy Protection Act (COPPA) by allegedly profiting from data collected from users under 13, thanks to lax policies that allow minors to access the app. The Department of Justice has also filed a separate lawsuit accusing TikTok of COPPA violations.

Furthermore, the AGs contend that TikTok has misled the public about its safety for young users. A lawsuit from New York, for example, alleges that TikTok falsely advertised its 60-minute screen time limit as more restrictive than it actually is, since teens can easily bypass it by entering a passcode. The suit also claims TikTok downplayed the risks associated with beauty filters and misrepresented the platform as safe for children, despite featuring content aimed at younger audiences.

The lawsuits seek to halt these allegedly harmful practices and impose financial penalties on TikTok. The platform also faces an even bigger threat: the looming possibility of a U.S. ban unless its Chinese parent company, ByteDance, divests it as federal law requires.