
TikTok introduces new ways to filter out mature or ‘potentially problematic’ videos


Amid rising controversy over the harmful effects of social media on teens, TikTok announced several updates on Wednesday intended to help users moderate their viewing preferences and filter out content they perceive to be "mature" or "potentially problematic if viewed repeatedly."

According to Cormac Keenan, the company's head of trust and safety, those topics include dieting, extreme fitness, and sadness.

The changes are a response to an investigation launched by a coalition of state attorneys general earlier this year into TikTok’s impact on young Americans. In a statement at the time, TikTok said it limits its features by age, provides tools and resources to parents, and designs its policies with the well-being of young people in mind.

The new TikTok safeguards will assign a "maturity score" to videos, building a system that organizes content by thematic maturity, similar to the ratings used in film and television. The goal, according to Keenan, is "to help prevent content with overtly mature themes from reaching audiences between ages 13-17."

In the Wednesday blog post, Keenan said the company is "focused on further safeguarding the teen experience" and will add new functionality in the coming weeks to provide more detailed content filtering options.
