Today, Meta revealed that it’s implementing new direct message restrictions for teenagers on both Facebook and Instagram. Going forward, all users under 16, and under 18 in some regions, will have the new restrictions applied by default.
Before this announcement, Instagram only restricted adults over 18 from messaging teenagers who didn’t follow them. Meta says it will alert existing users to the change via an in-app notification.
Additionally, Meta is enhancing its parental control features by giving parents the ability to approve or reject changes teenagers make to their default privacy settings. Previously, parents were only notified when such changes were made and had no way to intervene.
Meta offered an example: if a teenager tries to switch their account from private to public, change their Sensitive Content Control setting from “Less” to “Standard”, or alter who can send them direct messages, their parents can block the change.
The company is also preparing to launch a feature designed to stop teens from seeing unwanted and inappropriate images in DMs from people they’re connected to. According to Meta, the feature will work even in end-to-end encrypted chats and will also discourage teens from sending such images. However, Meta offered no specifics on how it will protect teens’ privacy while implementing these features, nor did it define what counts as “inappropriate”.
Earlier this month, Meta introduced new tools designed to prevent teenagers from viewing content related to self-harm or eating disorders on Facebook and Instagram.
Recently, European Union regulators formally requested more details from Meta about the tech giant’s measures to prevent the distribution of self-generated child sexual abuse material (SG-CSAM).
Meanwhile, a civil lawsuit filed against the company in New Mexico state court alleges that Meta’s platforms expose teenagers to sexual content and recommend underage accounts to potential predators. In October, more than 40 US states filed a lawsuit in a California federal court alleging that Meta’s products are designed in ways that harm children’s mental health.
In January 2024, representatives from Meta, along with executives from other social media companies including TikTok, Snap, Discord, and X (formerly Twitter), are scheduled to testify before the Senate on matters related to child safety.