Friday, September 20, 2024 - 3:22 pm

Instagram to launch family-controlled ‘teen accounts’ after security criticism

The safety of minors on Instagram is in the authorities’ crosshairs. Former executives of the Meta-owned social network have revealed that it allowed sexual harassment of teenagers to go on for years; in the United States, more than 40 states have sued the platform, accusing it of being “harmful and addictive” for young users; and within the EU, the European Parliament has urged Brussels to act, with the Spanish government taking the first step.

Meta has taken note of these concerns and on Tuesday announced a new feature it calls “teen accounts.” These are profiles subject to a special set of restrictions that apply when a user under 16 signs up for Instagram. To deactivate them, the teen will need their family’s consent. The configuration also lets families set stricter controls, such as monitoring who the teen is messaging.

“Teen accounts have built-in protections that limit who can contact them and what content they see, while providing new ways for teens to explore their interests. We will automatically place teens on these accounts, and teens under 16 will need their parents’ permission to change these settings to make them less strict,” Meta said in a statement.

To manage these controls, at least one of the teen’s parents or guardians must also have an Instagram account. The new set of tools will begin rolling out this Wednesday in the United Kingdom, the United States, Canada and Australia. In the European Union, it will be activated “at the end of the year,” the social media group announced.

Families will be able to see who their teens are chatting with

The restrictions on “teen accounts” affect a variety of features. One of them allows families to see who their children are chatting with. “While parents can’t read their children’s messages, they will now be able to see who their child has been messaging in the last seven days,” Meta explains.

This measure will not be enabled by default, but families can turn it on if they wish. The same applies to daily time limits, which let families decide how much time a teenager can spend on Instagram each day (for example, a maximum of one hour). “Once the teenager reaches this limit, they will no longer be able to access the application,” Meta explains. It will also be possible to block use of the social network during specific periods, such as school hours or at night.

These three options must be configured by families from the teenager’s account control panel. Other restrictions will be enabled by default and limit who can see teens’ posts or interact with them. For example, under-16s will have a “private account,” meaning that people who do not follow them will not be able to see their content, comment on it, like it or send them messages.

“We will also automatically enable the most restrictive version of our anti-bullying feature, Hidden Words, so that offensive words and phrases are filtered out of comments and direct message requests aimed at teens,” Meta added.

Finally, teen accounts will also carry a content restriction that limits what they see in Reels (Instagram’s vertical short-video feature, similar to TikTok) and in the Explore section. “Teens will automatically be placed on the most restrictive setting of our sensitive content control, which limits the type of sensitive content they see (such as content depicting fights or promoting cosmetic procedures),” Meta explains.

Growing pressure on Instagram

In recent years, Instagram has come under increasing pressure over its effects on the mental health of adolescents. Several surveys have pointed to the negative impact of the social network on young people’s self-image, particularly among teenage girls, aggravating problems such as anxiety and depression. Former Meta employees have stated that the company was aware of these problems but continued to prioritize profits and keeping adolescents engaged over implementing protective measures like those announced today.

Lawsuits in the US accuse the company of exploiting teens’ vulnerabilities through addictive algorithms, worsening mental health problems among young people. Prosecutors in states including California and Colorado say platforms like Instagram are comparable to tobacco in their addictive potential and the health risks they pose to minors, intensifying calls for greater regulation.

In Spain, the government approved a bill aimed at strengthening protections for adolescents on social networks, such as raising from 14 to 16 the minimum age required to register on these platforms without parental authorization. Another of its measures resembles what Instagram is now launching: parental controls enabled by default (“from the factory”) on all devices.
