Digital Focus

Exposed: TikTok’s Algorithm Pushes Porn to Children Despite Safety Promises

A new investigation by human rights campaign group Global Witness has revealed that TikTok’s algorithm is recommending pornography and highly sexualised content to accounts registered as 13-year-olds, even when safety settings are fully enabled.

Researchers from Global Witness created four fake child accounts between late July and early August, turning on TikTok’s “restricted mode,” which is supposed to filter out adult themes.

Despite this, they were soon met with explicit search suggestions and videos depicting sexual acts, including pornographic material involving penetrative sex.

The report found that these explicit videos were often embedded within otherwise innocent content, an apparent attempt to evade content moderation systems.

“TikTok isn’t just failing to prevent children from accessing inappropriate content, it’s suggesting it to them as soon as they create an account,” said Ava Lee of Global Witness, calling the discovery a “huge shock.”

TikTok claims to have taken immediate action after being alerted, stating that it has over 50 safety features to protect younger users and removes 90% of policy-violating content before it’s viewed.

However, when Global Witness repeated the experiment after the UK’s Online Safety Act Children’s Codes came into force in July, the same issues persisted, raising serious concerns about the app’s compliance with new child protection standards.

Under the Children’s Codes, online platforms are now legally required to implement robust age assurance measures and ensure their algorithms do not expose children to harmful or sexual content.

This investigation underscores an urgent need for stronger regulatory oversight and accountability in digital spaces, reaffirming every child’s right to protection from sexual exploitation and harmful online content.

