The Chinese social network TikTok confirmed this Wednesday to the American outlet The Verge that it will broadly prohibit users under 18 from using appearance-altering effects, referring to so-called beauty filters, a measure taken to protect their mental health.
TikTok said yesterday in a statement that the measure would take effect “in the coming weeks” and that it responds to a study it commissioned from the British NGO Internet Matters, published yesterday, which raises concerns about the impact of these filters on minors’ sense of identity.
The social network, which hosted a safety forum in Dublin this week, said there was a “clear distinction” between filters designed to be “obvious and fun,” such as those that add animal features, and those that “change your appearance” in ways that are almost undetectable to viewers.
TikTok, in its statement on the forum, promised to restrict the use of “certain appearance effects” for those under 18, and its head of public safety and well-being policy in Europe, Nikki Soo, confirmed that the measure will apply worldwide, according to The Verge.
The platform says it already “proactively” notifies users when certain effects have been applied in the content they see, but it will now provide “more information about how an effect can change their appearance” and help them understand the “undesirable outcomes” these can have.
Last month, fourteen US prosecutors sued TikTok for harming children’s mental health, accusing it of using an addictive content system to exploit young users and directly targeting the use of this type of filter.
Specifically, they reported that “beauty filters” can lower self-esteem, particularly among younger girls, and cited studies in which 50% said they do not feel pretty without retouching their face and 77% said they try to change or hide part of their body with the tool.
TikTok disclosed in its statement that it has 175 million monthly users in Europe and that every month it deletes 6 million accounts created by children under 13 (its minimum age), which is why it collaborates with NGOs, legislators, and regulators to strengthen its safeguards, including through “machine learning” technology.
It also announced that in the coming weeks it would offer its users in 13 European countries local helplines staffed by experts when they “report content in the app regarding suicide, self-harm and harassment,” content the platform already reviews and removes if it finds its policies were violated.