    This picture taken on October 5, 2020 in Toulouse, southwestern France, shows logos of US social networks Facebook and Instagram on the screens of a tablet and a mobile phone. (Photo by Lionel BONAVENTURE / AFP) (Photo by LIONEL BONAVENTURE/AFP via Getty Images)

Meta implements stronger restrictions on teen messaging and enhances parental control features

By Tapiwa Matthew Mutisi on January 25, 2024

Today, Meta revealed that it’s implementing new direct message restrictions for teenagers on both Facebook and Instagram. Going forward, all users under 16 (and under 18 in some regions) will have the new restrictions applied by default.

Before this announcement, Instagram only restricted adults over 18 from messaging teenagers who didn’t follow them. Meta plans to notify existing users of the change.


On Messenger, teen users will only receive messages from people who are either Facebook friends or in their contact list.

Additionally, Meta is enhancing its parental control features by giving parents the ability to approve or deny changes that teenagers make to the default privacy settings. Previously, parents were only notified when such changes were made and had no way to intervene.

Meta offered an example: if a teenager tries to switch their account’s visibility from private to public, adjust the Sensitive Content Control setting from “Less” to “Standard”, or alter who can send them direct messages, their parents can block the change.


    Meta initially introduced parental supervision tools for Instagram in 2022 to give guardians insights into their teenagers’ social media usage.

The company is also preparing to debut a feature aimed at preventing teens from encountering unwanted and inappropriate images in DMs from their connections. According to Meta, the feature will work even in end-to-end encrypted chats and will also discourage teens from sending such images. However, Meta did not share specifics on how it will protect teen privacy while implementing these features, nor did it define what counts as “inappropriate”.

    Earlier this month, Meta introduced new tools designed to prevent teenagers from viewing content related to self-harm or eating disorders on Facebook and Instagram.

Recently, European Union regulators formally requested information from Meta, seeking more details about the tech giant’s measures to prevent the distribution of self-generated child sexual abuse material (SG-CSAM).

Simultaneously, a civil lawsuit filed against the company in New Mexico state court alleges that Meta’s social platforms expose teenagers to sexual content and steer underage accounts toward potential predators. In October, more than 40 US states filed a lawsuit in a California federal court alleging that Meta’s products are designed in ways that harm children’s mental health.

Later this month, representatives from Meta, along with other social media companies such as TikTok, Snap, Discord, and X (previously known as Twitter), are scheduled to testify before the Senate on matters related to child safety.

    Tapiwa Matthew Mutisi
    Tapiwa Matthew Mutisi has been covering blockchain technology, intelligent technologies, cryptocurrency, cybersecurity, telecommunications technology, sustainability, autonomous vehicles, and other topics for Innovation Village since 2017. In the years since, he has published over 4,000 articles — a mix of breaking news, reviews, helpful how-tos, industry analysis, and more. | Open DM on Twitter @TapiwaMutisi
