**TikTok’s Algorithm Not Only Allows Children to Access Pornographic Content, It Actively Pushes Them Toward It, Report Finds**
A report released Friday by Global Witness, a nonprofit organization specializing in investigative research, reveals alarming issues with TikTok’s algorithm. According to the report, TikTok’s system not only permits children to access pornography and highly sexualized content but actively directs them toward such material.
Researchers posing as 13-year-olds created accounts on TikTok and quickly received recommendations for sexually explicit search suggestions, despite having safety settings for minors enabled. Some of the search bar suggestions on these fabricated children’s accounts included terms like “hardcore pawn clips” and “very very rude skimpy outfits,” leading to videos showing women simulating masturbation, flashing their underwear, or exposing their breasts.
At its most extreme, these search suggestions reportedly led to pornographic films depicting penetrative sex, according to Global Witness. The group also discovered that pornographic content had been edited into seemingly innocent videos to circumvent TikTok’s content moderation efforts. In one case, explicit content was just two clicks away after logging into the app—one click into the search bar and another on the suggested search term.
“Our point isn’t just that TikTok shows pornographic content to minors,” Global Witness stated. “It is that TikTok’s search algorithms actively push minors towards pornographic content.”
**TikTok’s Response and Further Developments**
Global Witness immediately reported these findings to TikTok. In response, a TikTok spokesperson told The Post: “As soon as we were made aware of these claims, we took immediate action to investigate them, remove content that violated our policies and launch improvements to our search suggestion feature.”
The spokesperson added that TikTok removes nine out of ten videos that violate its community guidelines, including bans on nudity and sexual content, before they are ever viewed.
The report follows President Trump’s signing of an executive order approving the transfer of TikTok’s U.S. operations to a consortium of American-based investors.
Henry Peck, campaign strategy lead for digital threats at Global Witness, expressed surprise at the findings. “Global Witness typically doesn’t investigate children’s digital safety; instead, we focus on how big tech impacts discussions around human rights, democracy, and climate change,” he told The Post.
Peck explained that the researchers discovered the explicit content suggestions by chance while conducting unrelated research in April. Though TikTok responded and claimed to have taken immediate action, the sexualized search recommendations reappeared when Global Witness repeated the experiment in July and August.
Sexualized search recommendations continued to appear on the fake children’s accounts created in the UK on clean phones without any prior search history.
**TikTok’s Popularity Among Teens and User Complaints**
TikTok is especially popular among teenagers. According to a Pew Research Center survey, about six in ten teens say they use TikTok daily, with 16% reporting being on the app almost constantly.
Users themselves have voiced concerns about highly sexualized search suggestions. The report notes that several users posted screenshots of troubling search recommendations with captions such as “Can someone explain to me what is up w my search recs pls?” Others commented, writing, “I THOUGHT I WAS THE ONLY ONE,” and “How tf do you get rid of it like I haven’t even searched for it,” as well as “Same, what’s wrong with this app?”
Peck stated, “TikTok claims to have guardrails in place to make children and young people safe on its platform, yet we’ve discovered that moments after creating an account, they serve kids pornographic content.”
He added, “Now it’s time for regulators to step in.”
https://nypost.com/2025/10/03/business/tiktok-pushes-porn-sexual-content-to-13-year-olds-accounts-report/