Instagram is reportedly testing a new feature designed to block explicit images in direct message requests. The platform, which has more than 1 billion users worldwide, plans to roll out the feature once the trial phase is complete. The move follows criticism of Instagram's failure to address abuse sent via direct messages, particularly abuse directed at women.
According to a report in the Guardian, the trial was initiated after research revealed a disturbing pattern of unsolicited direct messages sending abusive content to high-profile women. Meta, Instagram's parent company, confirmed that users will no longer receive unsolicited images or videos from people they don't follow.
Under the proposed feature, users will be limited to sending one direct message request to someone who doesn’t follow them, and this request will be limited to text-only communication. Only after the recipient accepts the request will the sender be able to send images or videos via direct message. The intention is to grant users more control over the content they receive and reduce the risk of explicit or offensive material.
Instagram's existing tools have failed to effectively address abuse sent via direct messages. A significant share of the abusive content reported to the platform, including violent threats and image-based sexual abuse, went unaddressed within the recommended 48-hour timeframe.
Women’s rights activists and civil society groups have long campaigned for social media platforms to take misogyny and gender-based violence seriously, urging them to treat such abuse as a violation of their community standards.
The Center for Countering Digital Hate (CCDH), alongside Ultraviolet, the Women’s Disinformation Defense Project, and numerous other organisations, has called on platforms to improve their policies and stop perpetuating hate and misinformation that disproportionately affect women, BIPOC individuals, and LGBTQ+ communities.
In September last year, we reported that Instagram was working on a privacy feature to protect users from receiving explicit content or nudity from unknown senders in direct messages. App developer Alessandro Paluzzi, who first discovered the feature, revealed that Instagram would display a cautionary message stating, “Instagram CAN’T access photos.” Paluzzi noted that the privacy feature was still in the early stages of development and would be optional for users.
In addition to the explicit image blocking feature, Instagram is making other changes to improve user safety. One change encourages teenagers, after they block someone, to add their parents as supervisors of their accounts, providing an extra layer of support.
The platform is also globally launching its quiet mode feature, which was initially introduced in the UK, US, and Australia. Quiet mode allows users to disable notifications and sends automated replies to direct messages.
Furthermore, Instagram is considering a feature that prompts teens to close the app if they are using Reels, the platform’s short video feature, at night.