TikTok hit by over a dozen state lawsuits for allegedly harming teens

Illustration by Nick Barclay / The Verge

Attorneys general from 14 states and districts sued TikTok for allegedly harming kids’ mental health and misleading the public about how safe its platform is.
The bipartisan group of AGs, led by New York’s Letitia James and California’s Rob Bonta, each filed lawsuits alleging violations of their own state’s law. “Our investigation has revealed that TikTok cultivates social media addiction to boost corporate profits,” Bonta said in a statement. “TikTok intentionally targets children because they know kids do not yet have the defenses or capacity to create healthy boundaries around addictive content.” James called the lawsuits part of an effort “to protect young people and help combat the nationwide youth mental health crisis.”
The suits argue TikTok violated the law by designing features and promoting content harmful to children. It’s a strategy that’s had some success in overcoming the liability shield of Section 230, which protects services from lawsuits over user speech. The AGs accuse TikTok of using addictive features that keep kids on the app longer, like autoplaying videos, promoting live content and stories that are only available temporarily, and offering beauty filters on videos. They also reference dangerous challenges that have gone viral on TikTok and in some cases been connected to teens’ deaths.
The enforcers allege this behavior runs afoul of a variety of laws. They claim TikTok violated the federal Children’s Online Privacy Protection Act (COPPA) by profiting off data from kids under 13 through “deficient policies and practices” that allegedly let minors onto the service knowingly. (The Department of Justice also recently accused TikTok of violating COPPA in a separate lawsuit.) And they say TikTok has misled the public about its safety for kids. The New York suit, for example, alleges TikTok violated state consumer protection law by marketing its 60-minute screen time limit as stricter than it actually is, because teens can enter a passcode to dismiss it. The suit also claims TikTok failed to warn users about the dangers of using beauty filters and misrepresented its platform as not geared toward children despite having “child-directed” content.
The suits petition courts to stop the allegedly harmful behavior and impose financial penalties. Of course, TikTok is already facing an even more existential threat: the possibility of a ban in the US should it lose its challenge to a federal law and fail to divest from its Chinese parent company, ByteDance.
In a statement, TikTok spokesperson Alex Haurek said the company “strongly disagree[s] with these claims, many of which we believe to be inaccurate and misleading. We’re proud of and remain deeply committed to the work we’ve done to protect teens and we will continue to update and improve our product.” TikTok has tried to work with the group of AGs for more than two years, Haurek said, “and it is incredibly disappointing they have taken this step rather than work with us on constructive solutions to industrywide challenges.”
State AGs have taken a leading role in suing tech platforms for allegedly harming kids’ safety. New Mexico AG Raúl Torrez has sued both Snap and Meta for allegedly facilitating child predators on their platforms. And dozens of states filed suit against Meta last year for allegedly misleading the public about its harms to kids.
