For the first time, internal TikTok communications have been made public that show a company unconcerned with the harms the app poses to American teenagers, even as its own research validated many child safety concerns.
The confidential material was part of a more than two-year investigation into TikTok by 14 attorneys general that led to state officials suing the company on Tuesday. The lawsuit alleges that TikTok was designed with the express intention of addicting young people to the app. The states argue the multi-billion-dollar company deceived the public about the risks.
In each of the separate lawsuits state regulators filed, dozens of internal communications, documents and research data were redacted, or blacked out from public view, because authorities entered into confidentiality agreements with TikTok.
But in one of the lawsuits, filed by the Kentucky Attorney General’s Office, the redactions were faulty. This was revealed when Kentucky Public Radio copied and pasted excerpts of the redacted material, bringing to light some 30 pages of documents that had been kept secret.
After Kentucky Public Radio published excerpts of the redacted material, a state judge sealed the entire complaint following a request from the attorney general’s office “to ensure that any settlement documents and related information, confidential commercial and trade secret information, and other protected information was not improperly disseminated,” according to an emergency motion to seal the complaint filed on Wednesday by Kentucky officials.
NPR reviewed all the portions of the suit that were redacted, which highlight TikTok executives speaking candidly about a host of dangers for children on the wildly popular video app. The material, mostly summaries of internal studies and communications, shows that some remedial measures, like time-management tools, would do little to reduce screen time. The company released and touted the features anyway.
Separately, under a new law, TikTok has until January to divest from its Chinese parent company, ByteDance, or face a nationwide ban. TikTok is fighting the looming crackdown. Meanwhile, the new lawsuits from state authorities have drawn fresh scrutiny to the app and its ability to counter content that harms minors.
In a statement, TikTok spokesman Alex Haurek defended the company’s child safety record and condemned the disclosure of material that was briefly public and has since been placed under court seal.
“It is highly irresponsible of NPR to publish information that is under a court seal,” Haurek said. “Unfortunately, this complaint cherry-picks misleading quotes and takes outdated documents out of context to misrepresent our commitment to community safety.”
He continued: “We have robust safeguards, which include proactively removing suspected underage users, and we have voluntarily launched safety features such as default screentime limits, family pairing, and privacy by default for minors under 16.”
Kentucky AG: TikTok users can become ‘addicted’ in 35 minutes
As TikTok’s 170 million U.S. users can attest, the platform’s hyper-personalized algorithm can be so engaging it becomes difficult to close the app. TikTok determined the precise amount of viewing it takes for someone to form a habit: 260 videos. After that, according to state investigators, a user “is likely to become addicted to the platform.”
In a previously redacted portion of the suit, Kentucky investigators wrote: “While this may seem substantial, TikTok videos can be as short as 8 seconds and are played for viewers in rapid-fire succession, automatically. Thus, in under 35 minutes, an average user is likely to become addicted to the platform.” (At 8 seconds apiece, 260 videos add up to roughly 2,080 seconds, or just under 35 minutes.)
Another internal document shows that the company was aware its many features designed to keep young people engaged created a constant and irresistible urge to keep opening the app.
TikTok’s own research states that “compulsive usage correlates with a slew of negative mental health effects like loss of analytical skills, memory formation, contextual thinking, conversational depth, empathy, and increased anxiety,” according to the suit.
In addition, the documents show that TikTok was aware that “compulsive usage also interferes with essential personal responsibilities like sufficient sleep, work/school responsibilities, and connecting with loved ones.”
TikTok: Time-limit tool aimed at ‘improving public trust,’ not limiting app use
The unredacted documents show that TikTok employees were aware that too much time spent by teens on social media can be harmful to their mental health. The consensus recommendation among academics is one hour or less of social media use per day.
The app lets parents place time limits on their kids’ usage that range from 40 minutes to two hours per day. TikTok created a tool that set the default time prompt at 60 minutes per day.
Internal documents show that TikTok measured the success of this tool by how it was “improving public trust in the TikTok platform via media coverage,” rather than how it reduced the time teens spent on the app.
After tests, TikTok found the tool had little impact: teens went from spending around 108.5 minutes per day on the app to roughly 107 minutes with the tool, a drop of about 1.5 minutes. According to the attorney general’s complaint, TikTok did not revisit this issue.
One document shows one TikTok project manager saying, “Our goal is not to reduce the time spent.” In a chat message echoing that sentiment, another employee said the goal is to “contribute to DAU [daily active users] and retention” of users.
TikTok has publicized its “break” videos, which are prompts to get users to stop endlessly scrolling and take a break. Internally, however, it appears the company didn’t think the videos amounted to much. One executive said that they are “useful in a good talking point” with policymakers, but “they’re not altogether effective.”
Document: TikTok demoted people it deemed unattractive on its feed
The multi-state litigation against TikTok highlighted the company’s beauty filters, which users can overlay on videos to make themselves look thinner and younger or to have fuller lips and bigger eyes.
One popular feature, known as the Bold Glamour filter, uses artificial intelligence to rework people’s faces to resemble models with high cheekbones and strong jawlines.
TikTok is aware of the harm these beauty filters can cause young users, the documents show.
Employees suggested internally the company “provide users with educational resources about image disorders” and create a campaign “to raise awareness on issues with low self esteem (caused by the excessive filter use and other issues).”
They also suggested adding a banner or video to the filters that included “an awareness statement about filters and the importance of positive body image/mental health.”
The documents also reveal another hidden facet of TikTok’s algorithm: the app prioritizes beautiful people.
One internal report that analyzed TikTok’s main video feed found that “a high volume of … not attractive subjects” was filling users’ feeds. In response, Kentucky investigators found, TikTok retooled its algorithm to amplify users the company viewed as beautiful.
“By changing the TikTok algorithm to show fewer ‘not attractive subjects’ in the For You feed, [TikTok] took active steps to promote a narrow beauty norm even though it could negatively impact their Young Users,” the Kentucky authorities wrote.
TikTok exec: algorithm could deprive kids of opportunities like ‘looking at someone in the eyes’
Publicly, TikTok has stated that one of its “most important commitments is supporting the safety and well-being of teens.”
Yet internal documents paint a very different picture, citing statements from top company executives who appear well aware of the app’s harmful effects yet have not taken significant steps to address them.
One unnamed TikTok executive put it in stark terms, saying the reason kids watch TikTok is the power of the app’s algorithm, “but I think we need to be cognizant of what it might mean for other opportunities. And when I say other opportunities, I literally mean sleep, and eating, and moving around the room, and looking at someone in the eyes.”
TikTok’s internal estimate: 95% of smartphone users under 17 use TikTok
TikTok views itself as being in an “arms race for attention,” according to a 2021 internal presentation.
Teenagers have been key to the app’s early growth in the U.S., but another presentation shown to top company officials revealed that an estimated 95% of smartphone users under 17 use TikTok at least once a month. This led a company staffer to conclude that the app had “hit a ceiling among young users.”
TikTok’s own research concluded that kids were the most susceptible to being sucked into the app’s infinitely flowing feed of videos. “As expected, across most engagement metrics, the younger the user, the better the performance,” according to a 2019 TikTok document.
In response to growing national concern that excessive social media use can increase the risk of depression, anxiety and body-image issues among kids, TikTok has introduced time-management tools. These include notifications informing teens about how long they are spending on the app, parental oversight features and the ability to make the app inaccessible for some down time.
At the same time, however, TikTok knew how unlikely it was these tools would be effective, according to materials obtained by Kentucky investigators.
“Minors do not have executive function to control their screen time, while young adults do,” read a TikTok internal document.
TikTok pushes users into filter bubbles like ‘painhub’ and ‘sadnotes’
TikTok is well aware of “filter bubbles.” Internal documents show the company has defined them as when a user “encounters only information and opinions that conform to and reinforce their own beliefs, caused by algorithms that personalize an individual’s online experience.”
The company knows the dangers of filter bubbles. During one internal safety presentation in 2020, employees warned that the app “can serve potentially harmful content expeditiously.” TikTok conducted internal experiments with test accounts to see how quickly they descended into negative filter bubbles.
“After following several ‘painhub’ and ‘sadnotes’ accounts, it took me 20 mins to drop into ‘negative’ filter bubble,” one employee wrote. “The intensive density of negative content makes me lower down mood and increase my sadness feelings though I am in a high spirit in my recent life.”
Another employee said, “there are a lot of videos mentioning suicide,” including one asking, “If you could kill yourself without hurting anybody would you?”
In another document, TikTok’s research found that content promoting eating disorders, often called “thinspiration,” is associated with issues such as body dissatisfaction, disordered eating, low self-esteem and depression.
Despite these warnings, TikTok’s algorithm still puts users into filter bubbles. One internal document states that users are “placed into ‘filter bubbles’ after 30 minutes of use in one sitting.” The company wrote that having more human moderators to label content is possible, but “requires large human efforts.”
TikTok’s content moderation missing self-harm, eating disorder content
TikTok has several layers of content moderation to weed out videos that violate its Community Guidelines. Internal documents show that the first set of eyes isn’t always a person from the company’s Trust and Safety Team.
The first round typically uses artificial intelligence to flag pornographic, violent or political content. The following rounds use human moderators, but only if the video has a certain number of views, according to the documents. These additional rounds often fail to take into account certain types of content or age-specific rules.
The unredacted filing shows that, according to TikTok’s own studies, some suicide and self-harm content escaped those first rounds of human moderation. The study points to self-harm videos that had more than 75,000 views before TikTok identified and removed them.
TikTok also has scattershot policies on content that includes disordered eating, drug use, dangerous driving, gore and violence. While TikTok’s Community Guidelines prohibit much of this content, internal policy documents say the company “allows” the content. Often, the content is findable on TikTok but just not “recommended,” meaning it doesn’t show up in users’ For You feeds or takes a lower priority in the algorithm.
The company has talking points around its content moderation work. One example highlighted in the documents details a child sent to the emergency room after attempting a dangerous TikTok challenge. When dealing with the negative fallout from the press, TikTok told employees to use an internal list of talking points that said, “In line with our Community Guidelines, we do not allow content that depicts, promotes, normalizes, or glorifies [dangerous] behavior, including dangerous challenges.”
TikTok acknowledges internally that it has substantial “leakage” rates of violating content that’s not removed. Those leakage rates include: 35.71% of “Normalization of Pedophilia”; 33.33% of “Minor Sexual Solicitation”; 39.13% of “Minor Physical Abuse”; 30.36% of “leading minors off platform”; 50% of “Glorification of Minor Sexual Assault”; and 100% of “Fetishizing Minors.”
TikTok slow to remove users under 13, despite company policy
Kids under 13 cannot open a standard TikTok account, but there is a “TikTok for Younger Users” service that the company says includes strict content guardrails.
It is a vulnerable group of users, since federal law dictates that social media sites like TikTok cannot collect data on children under 13 unless parents are notified about the personal information collected. And even then, social media apps must first obtain verifiable consent from a parent.
In August, the Department of Justice sued TikTok for violating the federal law protecting the data of kids under 13, alleging that the app “knowingly and repeatedly violated kids’ privacy.”
In the internal documents, however, company officials instructed TikTok moderators to use caution before removing accounts of users suspected to be under 13.
An internal document about “younger users/U13” says TikTok instructs its moderators to not take action on reports of underage users unless their account identifies them as under 13.
The previously redacted portions of the suit suggest the company is aware, through complaints from parents and teachers, that these young users have accounts, but does little to remove them.
TikTok in crisis mode after report on TikTok Live being ‘strip club filled with 15-year-olds’
After a 2022 Forbes report about underage kids stripping on TikTok’s live feature, the company launched its own investigation.
That’s when TikTok officials realized there was “a high” number of underage streamers receiving digital currency on the app in exchange for stripping, in the form of “gifts” or “coins”: real money converted into a digital currency, often represented as a plush toy or a flower.
TikTok discovered “a significant” number of adults direct messaging underage TikTokkers about stripping live on the platform.
As part of this internal probe, TikTok officials found that in just one month, 1 million “gifts” were sent to kids engaged in “transactional” behavior.
In an understated assessment, one TikTok official concluded: “[O]ne of our key discoveries during this project that has turned into a major challenge with Live business is that the content that gets the highest engagement may not be the content we want on our platform.”