To enhance online safety for younger users, Meta Platforms is rolling out new privacy and parental controls on Instagram.
Meta is transitioning all Instagram accounts for users under 18 to “Teen Accounts,” which will be private by default. Under the update, announced on Tuesday, teens can be messaged or tagged only by accounts they follow or are already connected to, and the strictest sensitive-content settings will be applied automatically.
Users under 16 will need parental consent to modify these default settings. Parents will also have tools to oversee their child’s interactions and regulate app usage.
Studies show that social media can worsen issues like depression, anxiety, and learning challenges among young users. Meta, along with TikTok and YouTube, is facing multiple lawsuits alleging that its platforms’ features are addictive. In 2023, 33 U.S. states, including California and New York, sued Meta for allegedly misleading the public about the dangers of its platforms.
Currently, Instagram, Facebook, and TikTok allow users as young as 13 to sign up. Meta’s new initiative comes three years after it halted development of a teen-focused Instagram app amid pressure from lawmakers and advocacy groups over safety concerns.
In July, the U.S. Senate advanced two online safety bills—the Kids Online Safety Act and the Children and Teens’ Online Privacy Protection Act—that seek to hold social media companies accountable for their impact on younger users.
The update will prompt Instagram users under 18 to take a break after 60 minutes of daily use and will include a default sleep mode that silences notifications overnight. Meta plans to move teens onto Teen Accounts within 60 days in the U.S., UK, Canada, and Australia, with the EU to follow later this year. A global rollout is set to begin in January.