Meta, the parent company of the social media platforms Instagram and Facebook, which is regularly accused of harming the mental health of young audiences, will no longer allow advertisers to target teens based on their gender.
Starting in February, groups wishing to advertise to minors on these platforms will only have access to their age and location, to ensure that ad content is appropriate and useful, Meta explained Tuesday in a post on its website.
The company, led by Mark Zuckerberg, which since the summer of 2021 has barred advertisers from accessing teenagers' browsing history on other sites, has now decided to extend this restriction to its own platforms.
This is tantamount to “depriving advertisers of the ability to target teens based on their interests and activities,” Meta says.
The group also plans to make it easier for those under 18 to indicate when they want to receive less advertising on certain topics, such as a TV show genre or a specific sport.
U.S. elected officials and child protection associations have criticized apps especially popular among young people, such as Instagram, Snapchat, YouTube and TikTok, for having a harmful effect on the youngest users.
The allegations took on a new dimension when, in the fall of 2021, former Facebook employee Frances Haugen released internal documents showing that platform executives were aware of certain risks to minors.
Since then, companies have tried to give guarantees to protect teenagers.
Insufficient effort, some say: public school officials in Seattle, U.S., filed a complaint Friday against several social media companies, accusing them of "attacks" on the mental health of minors.
"The rise in suicides, suicide attempts and mental health-related emergency room visits is no coincidence. (…) This crisis was already deepening before the pandemic, and studies have shown that social networks play an important role in the emergence of mental health problems among young people," they wrote in their complaint.