Meta Announces New Policies Placing Teens in Most Restrictive Facebook, Instagram Content Control Settings to Reduce Self-Harm Content Exposure


Meta has announced new policies and features designed to protect teens from self-harm posts, videos, and other harmful content on Facebook and Instagram.

The American tech giant announced these efforts on Tuesday, Jan. 9, saying it wants to ensure safe, age-appropriate experiences for teens on its social media platforms.


The tech firm said that it has already developed more than 30 tools and resources to protect teens and support their parents in keeping them safe.

Now, Meta has confirmed additional protections focused on the kinds of content teenagers see on Instagram and Facebook.

Meta to Protect Teens From Self-Harm Content on Facebook, Instagram

According to TechCrunch's latest report, Meta will automatically limit the kind of content that teenagers can see on Instagram and Facebook.

"We already apply this setting for new teens when they join Instagram and Facebook and are now expanding it to teens who are already using these apps," said Meta via its official blog post.

The tech firm explained its content recommendation controls, called Reduce on Facebook and Sensitive Content Control on Instagram.

Meta said these controls will make it harder for teenagers to come across potentially harmful content or accounts, particularly those featuring self-harm content.

The Reduce and Sensitive Content Control settings will also limit what younger users can access via Search and Explore.

What It Means for People Sharing Self-Harm Struggles


Meta clarified that it will still allow users to share their struggles with self-harm, suicide, and eating disorders on its social media platforms.

However, its recommendation systems will not surface these kinds of posts, which is why Meta is releasing new features and policies that make it harder for younger users to see or find them.

The social media giant said that it will now redirect users to expert resources for help whenever they search for terms related to self-harm, eating disorders, and suicide.

"We already hide results for suicide and self-harm search terms that inherently break our rules and we're extending this protection to include more terms," explained Meta.

Meta confirmed that all of these new features and policies will be rolled out to users across the globe over the coming weeks.
