Social media giant Facebook has launched a new artificial intelligence tool to detect non-consensual intimate images (sometimes referred to as revenge porn) with a view to curbing their circulation. Facebook already has a policy of removing these images, but the new AI tool is intended to find the content more quickly. Facebook is also launching an online resource hub to help people respond when this abuse occurs.
Facebook intends to use machine learning and artificial intelligence to proactively detect near-nude images or videos that are shared without permission on Facebook and Instagram. This will help Facebook detect the content before anyone reports it. According to Facebook, this is important for two reasons: “often victims are afraid of retribution so they are reluctant to report the content themselves or are unaware the content has been shared. A specially-trained member of our Community Operations team will review the content found by our technology. If the image or video violates our Community Standards, we will remove it, and in most cases we will also disable an account for sharing intimate content without permission. We offer an appeals process if someone believes we’ve made a mistake.”
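Facebook has not published details of its detection models, but the workflow it describes (automated flagging followed by trained human review) can be illustrated with a minimal sketch. The classifier, threshold, and review queue below are hypothetical stand-ins, not Facebook's actual systems.

```python
# Hypothetical sketch of proactive screening: an ML score routes likely
# violations to a human review queue before any user report is filed.
from dataclasses import dataclass

@dataclass
class Post:
    post_id: str
    image_bytes: bytes

def intimate_image_score(image_bytes: bytes) -> float:
    """Stand-in for a trained vision model returning the probability that an
    image is a non-consensually shared intimate image. A constant keeps the
    sketch runnable; a real system would run an actual classifier here."""
    return 0.0

REVIEW_THRESHOLD = 0.8  # assumed cutoff; Facebook does not publish thresholds

def screen_post(post: Post, review_queue: list) -> None:
    """Flag likely violations for human review; reviewers decide on removal,
    account action, and appeals, per the policy described above."""
    if intimate_image_score(post.image_bytes) >= REVIEW_THRESHOLD:
        review_queue.append(post.post_id)

review_queue: list = []
screen_post(Post("p1", b"..."), review_queue)
print(review_queue)
```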
“This new detection technology is in addition to our pilot program jointly run with victim advocate organizations. This program gives people an emergency option to securely and proactively submit a photo to Facebook. We then create a digital fingerprint of that image and stop it from ever being shared on our platform in the first place. After receiving positive feedback from victims and support organizations, we will expand this pilot over the coming months so more people can benefit from this option in an emergency,” says Antigone Davis, Facebook’s Global Head of Safety.
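The “digital fingerprint” Davis describes is commonly implemented with perceptual hashing: a compact hash of the submitted photo is stored and compared against new uploads, so matching images can be blocked without retaining the photo itself. The sketch below uses the open-source Pillow and imagehash libraries purely as an illustration; Facebook’s actual hashing technology is not public, and the file names and distance cutoff are assumptions.

```python
# Illustrative perceptual-hash fingerprinting, not Facebook's implementation.
from PIL import Image
import imagehash

def fingerprint(path: str) -> imagehash.ImageHash:
    """Compute a perceptual hash that survives minor edits such as resizing
    or re-encoding."""
    return imagehash.phash(Image.open(path))

def matches(upload_hash: imagehash.ImageHash,
            blocked_hashes: list,
            max_distance: int = 6) -> bool:
    """Return True if the upload is near-identical to any blocked fingerprint.
    max_distance is an assumed Hamming-distance cutoff, chosen for illustration."""
    return any(upload_hash - h <= max_distance for h in blocked_hashes)

# Usage: hash the victim-submitted photo once, then check each new upload.
blocked = [fingerprint("submitted_photo.jpg")]        # hypothetical file name
if matches(fingerprint("new_upload.jpg"), blocked):   # hypothetical file name
    print("Block upload and queue for review")
```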
To do more for people who have been the targets of this cruel and destructive exploitation, Facebook is also launching “Not Without My Consent,” a victim-support hub in its Safety Center that the company developed together with experts. There, victims can find organizations and resources to support them, including steps they can take to remove the content from the platform and prevent it from being shared further, and they can access Facebook’s pilot program. Facebook also plans to make it easier and more intuitive for victims to report when their intimate images have been shared on Facebook.