A recent report has revealed that TikTok’s algorithm not only allows children to access pornography and highly sexualized content but actively pushes them toward it. The findings were published on Friday by Global Witness, a nonprofit organization known for investigative research.
Researchers posing as 13-year-olds created accounts on TikTok with safety settings for minors enabled. Despite these precautions, the accounts quickly received sexually explicit search suggestions. Suggested terms included phrases such as “hardcore pawn clips” and “very very rude skimpy outfits,” which led to content featuring women simulating masturbation, flashing their underwear, or exposing their breasts.
At its most extreme, these search suggestions directed users to pornographic films depicting penetrative sex, according to the report. Global Witness also uncovered that pornographic clips had been edited into otherwise innocent videos to circumvent TikTok’s content moderation efforts.
In one instance, pornographic content was found just two clicks away after logging into the app — one click on the search bar and another on the suggested search term. “Our point isn’t just that TikTok shows pornographic content to minors. It is that TikTok’s search algorithms actively push minors towards pornographic content,” the report stated.
Global Witness immediately reported these findings to TikTok. A TikTok spokesperson responded, saying, “As soon as we were made aware of these claims, we took immediate action to investigate them, remove content that violated our policies, and launch improvements to our search suggestion feature.” The spokesperson also noted that TikTok removes nine out of 10 videos that violate its community guidelines—including bans on nudity and sexual content—before they are viewed by users.
The report comes shortly after President Trump signed an executive order supporting the transfer of TikTok’s U.S. operations to a consortium of American-based investors.
Henry Peck, campaign strategy lead for digital threats at Global Witness, told The Post that the findings came as a significant surprise to the organization. Global Witness typically focuses on big tech’s impact on human rights, democracy, and climate change rather than children’s digital safety; its researchers stumbled upon the explicit search suggestions while conducting unrelated research in April.
Although TikTok claimed to have taken immediate action after being contacted, researchers repeated the experiment in July and August and continued to find sexual content recommendations on the fake children’s accounts. These accounts were created in the UK on clean devices with no prior search history, yet sexualized search recommendations persisted.
TikTok is particularly popular among teenagers: about six in 10 teens report using the app daily, and 16% say they use it “almost constantly,” according to Pew Research Center.
The new report also highlights user complaints about highly sexualized search suggestions. Several users posted screenshots of these troubling recommendations with captions like, “Can someone explain to me what is up with my search recs, please?” Others responded with comments such as, “I THOUGHT I WAS THE ONLY ONE,” “How do you get rid of it? I haven’t even searched for this,” and “What’s wrong with this app?”
“TikTok claims to have guardrails in place to make children and young people safe on its platform, yet we’ve discovered that moments after creating an account, they serve kids pornographic content,” Peck told The Post. He added, “Now it’s time for regulators to step in.”
https://nypost.com/2025/10/03/business/tiktok-pushes-porn-sexual-content-to-13-year-olds-accounts-report/