Meta, the parent company of Facebook, Instagram, and Messenger, is strengthening its commitment to teen safety by extending its “Teen Accounts” protection framework beyond Instagram to now include Facebook and Messenger. This move reflects Meta’s ongoing effort to create safer digital spaces for young users across its entire platform ecosystem.
Originally launched on Instagram in 2024, the “Teen Accounts” initiative introduced a suite of age-appropriate, built-in safety measures aimed at shielding teenagers from harmful content and interactions online. With its expansion to Facebook and Messenger, Meta is now applying the same level of rigor and oversight to its broader social media network.
New Protections Now Active on Facebook and Messenger
Under the updated framework, all new teen users on Facebook and Messenger will automatically be placed in private accounts. This setting ensures that only approved friends can see their content, helping to protect them from unwanted attention or exposure.
Meta is also limiting who can message teens, allowing only individuals whom the teen already follows or has previously been in contact with. This significantly reduces the chance of unsolicited or inappropriate messages from strangers, a key concern for parents and guardians.
Another major addition is a set of screen time management tools. Teens will now receive notifications when they hit time limits, and a "quiet mode" will encourage them to take breaks from the app during nighttime hours. These features are designed to promote healthier digital habits and reduce potential overuse.
According to Meta, these protections are initially rolling out in the United States, United Kingdom, Canada, and Australia, with a broader international rollout planned for the near future.
Reinforcing Protections on Instagram
While the focus is on Facebook and Messenger, Meta is also adding new layers of protection on Instagram. Teens under 16 will now require parental consent before they can go live or disable certain safety settings.
In particular, young users will no longer be able to turn off the nudity filter in direct messages without a guardian’s approval. This tool automatically blurs potentially explicit images, offering a proactive layer of protection against inappropriate content.
Parental Involvement Is Central
Meta has emphasized that parents play a key role in helping teens navigate digital spaces safely. The company now requires parental approval for any significant changes to the default safety settings across all three platforms.
So far, the response has been positive. Meta reports that 94% of parents support the Teen Accounts features, and 97% of teens aged 13 to 15 have retained the default restrictions. This suggests that both parents and teens find value in the protections and are willing to engage with the tools provided.
A Response to Growing Pressure
This expansion comes amid increasing scrutiny from regulators and child safety advocates who are calling on tech companies to take more responsibility for the mental health and well-being of young users.
By extending its teen safety measures across Facebook and Messenger, Meta is sending a clear message: protecting teenagers online is not optional, it is essential. As these platforms continue to evolve, so too must the protections that come with them.