TikTok to ban teenagers from using beauty filters

News Desk
November 28, 2024

TikTok will soon block users under the age of 18 from accessing beauty filters that alter their facial features, as the platform responds to mounting concerns about such filters’ impact on teenage mental health and self-esteem.

The restrictions, expected to roll out globally in the coming weeks, will target filters such as “Bold Glamour,” which smooths skin, plumps lips and reshapes eyes, producing effects that are often difficult to distinguish from reality. Filters intended for comedic purposes, such as those that add animal ears or exaggerate facial features, will remain accessible to teenagers.

The changes follow a report from Internet Matters, a children’s online safety non-profit, which found that these filters contribute to a “distorted worldview” by normalising perfected images. Many teenagers, particularly girls, reported feeling pressure to conform to these altered appearances, with some stating they viewed their unfiltered faces as “ugly” after prolonged use of the filters.

Dr Nikki Soo, TikTok’s safety and well-being public policy lead for Europe, confirmed the rollout of the new restrictions, stating that the platform aims to “reduce the social pressure on young users” and promote healthier online habits.

The effectiveness of these restrictions will hinge on accurate age verification. TikTok plans to introduce automated systems using machine learning to detect users misrepresenting their age, part of a broader effort to remove underage users. Currently, the platform deletes approximately 20 million accounts every quarter for violating its age policy.

Chloe Setter, TikTok’s head of child safety public policy, acknowledged the challenges but maintained that the company would take a “safety-first approach,” even if it leads to some users being wrongly blocked. Users will be able to appeal bans if they believe they were removed in error.

The policy changes coincide with tighter regulation in the UK under the Online Safety Act, which will require social media platforms to implement “highly effective” age checks once its child-safety duties take effect. Ofcom, the UK’s communications regulator, has previously raised concerns over TikTok’s age verification measures, noting that their effectiveness was “yet to be established.”

Richard Collard, associate head of policy for child safety online at the NSPCC, welcomed TikTok’s move but stressed that more needs to be done. “This is a positive step, but it’s just the tip of the iceberg. Other social media platforms must follow suit and implement robust age-checking systems,” he said.

Setter echoed these sentiments, stating that TikTok will continue to refine its safety measures as it faces increased scrutiny from regulators. The platform also plans to tighten its restrictions on under-13 users, a demographic that has historically been difficult to monitor.

Other platforms are taking similar steps. Roblox recently restricted younger users from accessing violent or explicit content, and Instagram, owned by Meta, has introduced “teen accounts” that allow parents to monitor and control their children’s activity on the app.

Andy Burrows, CEO of the Molly Rose Foundation, emphasised the role of regulation in prompting these changes. “It’s clear that platforms are only making these shifts to comply with incoming EU and UK regulations. This underscores the need for even more ambitious legislation to protect children online,” he said.

As the year draws to a close, TikTok’s efforts to enforce age-appropriate experiences worldwide signal a broader shift across social media towards prioritising user safety, a trend expected to accelerate as new regulations take effect in 2025.
