More than a dozen states, led by California and New York, are suing TikTok for allegedly duping the public about the safety of the popular video app, claiming it was deliberately designed to keep young people hooked on the service.
The lawsuits, filed separately on Tuesday in 13 states and the District of Columbia, argue TikTok has violated consumer protection laws and contributed to a teen mental health crisis.
The bipartisan group of attorneys general is seeking to force TikTok to change product features that they argue are manipulative and harm teens. The suits are also asking courts to impose financial penalties on the company.
It is the latest headache for TikTok, which is trying to fend off a U.S. ban of the app slated to start Jan. 19, unless the company severs ties with ByteDance, its China-based parent company.
Used by half of America, TikTok will now be defending itself against a barrage of state lawsuits that tap into growing national unease over the design of social media platforms and questions about whether overuse of social media contributes to mental health problems like depression and body-image issues.
While pinpointing the exact role social media plays in worsening mental health issues is complicated, the state authorities claim TikTok is prioritizing the company’s growth and profits over child safety.
“TikTok intentionally targets children because they know kids do not yet have the defenses or capacity to create healthy boundaries around addictive content,” said California Attorney General Rob Bonta in a statement. “TikTok must be held accountable for the harms it created in taking away the time — and childhoods — of American children.”
Last year, a coalition of states filed a similar lawsuit against Instagram and Facebook owner Meta, accusing the tech giant of failing to keep children safe on the popular apps. Those cases are still pending.
TikTok, like most social media apps, tries to keep users as engaged as possible. But the attorneys general say features like its hyper-personalized algorithm, the ability to scroll endlessly and the app’s use of push notifications encourage excessive use that can lead to emotional and behavioral changes. The states say the company has downplayed the negative effects of the alleged dependence the app creates in an effort to boost its bottom line.
State investigators have been probing TikTok for more than two years, uncovering reams of internal communications from the company. In them, some TikTok staffers compared the app's algorithm to the addictive nature of slot machines.
In an interview with NPR, California's Bonta said TikTok's internal documents reveal that the company misled the public about what it knew about the addictive and other harmful effects of the platform.
"One TikTok executive referred to American teens as ‘the golden audience,’ and also stated 'It’s better to have young people as an early adopter,'" Bonta said. "They deployed a suite of manipulative features that exploited young people's psychological vulnerabilities."
The lawsuits particularly spotlight TikTok’s use of beauty filters, which can make users appear thinner and younger or apply virtual makeup to their faces using AI.
“Beauty filters have been especially harmful to young girls,” New York Attorney General Letitia James wrote in a statement. “Beauty filters can cause body image issues and encourage eating disorders, body dysmorphia, and other health-related problems.”
New York investigators argue TikTok failed to warn teens about these harms, and instead pushed beauty filters to its youngest users to keep them on the app longer.
Adolescents are left with growing feelings of inadequacy and self-doubt after using TikTok, the state attorneys say.
The District of Columbia’s suit alleges that TikTok traps teens in online bubbles that “bombard them with precisely the kinds of content that TikTok claims not to allow, including videos about weight-loss, body-image, and self-harm content.”
TikTok’s live-streaming feature is also abused, the state attorneys allege. In particular, they contend that thousands of underage users have hosted live streams in which viewers can send the streamer money in the form of TikTok “gifts,” a type of digital currency. This has been used to incentivize the sexual exploitation of children, according to the lawsuits.
TikTok spokesman Alex Haurek said the accusations in the lawsuits are misleading.
"We provide robust safeguards, proactively remove suspected underage users, and have voluntarily launched safety features such as default screentime limits, family pairing, and privacy by default for minors under 16," Haurek said.
Haurek noted that the lawsuits follow more than two years of negotiations with the attorneys general.
"It is incredibly disappointing they have taken this step rather than work with us on constructive solutions to industrywide challenges."
The suits will now proceed in 14 separate state courts. State officials say the cases will not be merged, since each complaint relies on specific state consumer protection laws. Individual trial dates will eventually be set, months or years from now, unless the cases are dismissed or settled.
In response to concerns, many social media apps, including TikTok, have beefed up child safety tools.
Last month, Meta announced a host of new features that enhance parental supervision on Instagram and made all teen accounts private in an effort to shield young people from interacting with potential predators.
Likewise, young TikTok users cannot send direct messages and their accounts are set to private by default to limit their exposure to people they do not know. The app also uses screen-time reminders to nudge users about how long they have been scrolling.
Yet in their suits, the states dismiss TikTok’s safety measures as meaningless public relations stunts, arguing that the company has done a lackluster job of verifying users’ identities when they open accounts, allowing adolescents to lie about their age and circumvent child safety measures.
Bonta, the California attorney general, wrote in a statement that TikTok’s features to protect children “do not work as advertised.”
“The harmful effects of the platform are far greater than acknowledged,” he continued, “and TikTok does not prioritize safety over profit.”
In addition to New York, California and the District of Columbia, the states suing TikTok include Illinois, Kentucky, Louisiana, Massachusetts, Mississippi, North Carolina, New Jersey, Oregon, South Carolina, Vermont and Washington.