The Popular Video Platform Reportedly Leads Child Accounts to Pornographic Content In Just a Few Taps

According to a new study, the widely used social media app has been found to guide children's accounts to explicit material in just a couple of taps.

Testing Approach

Global Witness set up test accounts using the birthdate of a 13-year-old and activated the "restricted mode" setting, which is intended to limit exposure to adult-oriented content.

Researchers found that TikTok proposed sexualized and explicit search terms to multiple test profiles that were set up on new devices with no previous activity.

Concerning Search Suggestions

Search terms proposed under the "recommended for you" feature included "provocative attire" and "explicit content featuring women", later escalating to phrases such as "explicit adult videos".

For three of the accounts, the sexualized searches were recommended immediately.

Rapid Access to Explicit Content

After a "small number of clicks", the research team encountered explicit material ranging from women flashing to penetrative sex.

The organization stated that the content attempted to evade moderation, often by embedding the video within a benign image or clip.

In one instance, the process took just two clicks after logging in: one tap on the search bar and another on the suggested query.

Regulatory Context

Global Witness, whose scope includes researching technology companies' influence on societal welfare, reported performing multiple testing phases.

One set occurred prior to the implementation of child protection rules under the British online safety legislation on the 25th of July, and another after the measures took effect.

Concerning Discoveries

The organization noted that several pieces of content appeared to show someone below the age of consent, and that these had been reported to the Internet Watch Foundation, which monitors harmful material involving minors.

The campaign group claimed that TikTok was in violation of the UK safety legislation, which obligates tech companies to block children from viewing inappropriate videos such as adult material.

Government Position

A spokesperson for Britain's media watchdog, which is responsible for enforcing the law, said: "We appreciate the research behind this study and will analyze its findings."

Ofcom's codes for complying with the act specify that digital platforms that carry a substantial risk of showing harmful content must "adjust their systems to filter out harmful content from young users' timelines".

The platform's rules prohibit explicit material.

Company Reaction

TikTok said that after being contacted by the organization, it had taken down the problematic material and introduced changes to its recommendation system.

"As soon as we were made aware of these assertions, we acted promptly to examine the issue, take down videos that violated our policies, and introduce upgrades to our recommendation system," said a company representative.

Kevin Jordan

A passionate historian and travel writer dedicated to uncovering the hidden gems of Italian cultural heritage.