As TikTok fights 14 attorneys general over the platform's impact on children's mental health, unredacted documents have revealed exactly how much the company's executives knew.
The damning material reveals that the company not only knew about these harms from its own research but put profits ahead of addressing them.
This revelation is likely to be the first of many from the lawsuits, in which 14 states have joined together to take the social media platform to task for its impact on the mental health of children and teenagers.
Knowingly Causing Damage
The unredacted information comes from the lawsuit brought by the Kentucky Attorney General's Office, and a faulty redaction has made it public. The majority of the information from the lawsuits remains sealed.
Kentucky Public Radio copied and pasted some 30 pages of notes, which were then picked up by NPR. The revelations are nothing short of chilling.
They show that TikTok’s own research found that “compulsive usage correlates with a slew of negative mental health effects like loss of analytical skills, memory formation, contextual thinking, conversational depth, empathy, and increased anxiety.”
More than that, says NPR, the company knew that usage could have an impact on sleep, schoolwork and even “connecting with loved ones.” The lawsuit suggests it takes less than 35 minutes for users to become addicted.
Safety Measures Ineffective
TikTok is supposed to be inaccessible to anyone under 13 years of age, but the lawsuit alleges that the app does not use age verification software. Moreover, moderators were told not to remove users reported as underage unless their account explicitly states that they are under 13.
It also takes aim at the measures TikTok has put into place for teenagers, which include warnings when their usage hits certain levels. The material published suggests that these nudges have a “negligible impact.”
It quotes TikTok's own research, which found that the average time teens spent on the platform per day dropped from 108.5 minutes to about 107 minutes after the prompt was deployed.
“TikTok measured the success of the tool, however, not by whether it actually reduced the time teens spent on the platform to address this harm, but by three unrelated ‘success metrics,’ the first of which was ‘improving public trust in the TikTok platform via media coverage.’” – the lawsuit against TikTok.
It also states that TikTok then did nothing to redesign the tool to address this excessive usage.
The Younger, the Better
Even more callous is the suggestion from the lawsuit that TikTok did nothing because it wanted this engagement. One document in the lawsuit pointed out:
“Across most engagement metrics, the younger the user, the better the performance.”
Suffice it to say, TikTok is going to be in some serious hot water over these discoveries.
Negative Spirals
The unredacted material also points to TikTok's internal research on "negative filter bubbles," in which the algorithm keeps steering children toward potentially damaging material once they follow certain accounts.
The lawsuit documents specifically mention painful ("painhub") and sad ("sadnotes") content, including "thinspiration," and note that it takes only 30 minutes in a single sitting before users are placed into one of these filter bubbles.
Exposed to Predators
The lawsuit documents point to another TikTok internal investigation that found that underage girls were being given TikTok gifts and “coins” in exchange for live stripping.
“The existence of these virtual rewards greatly increases the risk of adult predators targeting adolescent users for sexual exploitation.”
It adds that the company knew that one million “gifts” were sent to kids engaged in “transactional” behavior in just one month.
Young users were also being shown content that would “normalize pedophilia, glorify minor sexual assault and physical abuse.”
TikTok Claims Misrepresentation
TikTok has responded by claiming that the lawsuit “cherry-picks misleading quotes and takes outdated documents out of context to misrepresent our commitment to community safety.”
Spokesman Alex Haurek told the press that the platform has “robust safeguards, which include proactively removing suspected underage users” and that it has “voluntarily launched safety features such as default screentime limits, family pairing, and privacy by default for minors under 16.”