After facing sustained criticism for not doing enough to prevent suicides and other harm broadcast on its platform, Facebook is expanding its limited test of suicide- and self-harm reporting tools to everyone.
To improve detection, Facebook will begin using pattern recognition on posts and Live videos to spot when someone may be expressing suicidal thoughts. VP of product management Guy Rosen writes that the social network will also work on alerting first responders faster when the need arises, and that it will dedicate more human reviewers to posts flagged by its algorithms.
The AI-based detection tools are currently available only in the US, but they will soon roll out across the globe, with the exception of European Union countries. In the past month, Facebook has alerted more than 100 first responders to potentially fatal posts, on top of those reported by someone's friends and family.
Comments like "Are you okay?" and "Can I help?" turn out to be strong indicators that someone might be going through a very dark moment. Rosen says that by keying on those phrases, the algorithms have surfaced videos that might otherwise have gone unnoticed.
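Facebook hasn't published its actual models, but a minimal sketch of the kind of phrase-based signal Rosen describes might look like the following. The phrase list, function name, and threshold here are purely illustrative assumptions, not Facebook's real system:

```python
# Illustrative sketch only: Facebook has not disclosed its detection models.
# This demonstrates the general idea of phrase-based flagging described in
# the article, using a hypothetical phrase list and a simple threshold.

CONCERN_PHRASES = [
    "are you okay",
    "can i help",
    "please talk to me",
]

def flag_for_review(comments: list[str], threshold: int = 2) -> bool:
    """Return True if enough comments contain concern phrases to
    warrant escalating the post to a human reviewer."""
    hits = sum(
        any(phrase in comment.lower() for phrase in CONCERN_PHRASES)
        for comment in comments
    )
    return hits >= threshold

# Example: two concerned comments on a post would trigger a review.
comments = ["Are you okay??", "great pic", "Can I help with anything?"]
print(flag_for_review(comments))  # True
```

In practice, a system like Facebook's would presumably combine many more signals than comment text alone, which is why flagged posts still go to human reviewers rather than triggering responses automatically.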
Between its role in the 2016 election and the fire it came under for experimenting with whether manipulating the News Feed can alter users' moods, Facebook has an image to repair these days. Stories like this one can help, but until the successes outnumber the tragedies, the social network needs to keep at it.
If you or someone you know is experiencing suicidal thoughts, do not hesitate to contact the National Suicide Prevention Lifeline at 1-800-273-8255. The line is open 24/7, and online chat is available if a phone isn't.