
Facebook Outlines Ongoing Efforts to Prevent Self-Harm to Mark World Suicide Prevention Day

While social media platforms have done a lot of societal good in connecting us to other people and cultures, the public nature of social sharing also has significant downsides, which are particularly pronounced for younger users and those more susceptible to criticism and judgment.

Various studies have linked social media use to increased rates of depression, anxiety, and even self-harm, and as such, there's a level of onus on the platforms themselves to better police user activity, and to provide help wherever possible.


This week, to mark World Suicide Prevention Day, Facebook has provided an overview of its ongoing efforts to detect and address posts related to, or indicating, possible self-harm risk. And while it's not a problem that can ever be fully 'solved', Facebook says that it is making significant progress:

"From April to June of 2019, we took action on more than 1.5 million pieces of suicide and self-injury content on Facebook and found more than 95% of it before it was reported by a user. During that same time period, we took action on more than 800 thousand pieces of this content on Instagram and found more than 77% of it before it was reported by a user." 

Among the various advances, Facebook says that it has expanded its bans on "graphic cutting images" and eating disorder content, resulting in less exposure for at-risk users. Facebook also now displays a sensitivity screen over "healed self-harm cuts to help avoid unintentionally promoting self-harm". 

In addition to advancing its AI, and providing content warnings, Facebook says that it's now also looking to hire a health and well-being expert to join its safety policy team.

"This person will focus exclusively on the health and well-being impacts of our apps and policies, and will explore new ways to improve support for our community, including on topics related to suicide and self-injury." 

Facebook's also exploring ways to share public data from its platform for research into how people talk about suicide, and it's aiming to partner with academic organizations to establish even more effective detection processes.

"We're eager to make [the data] available to two select researchers who focus on suicide prevention to explore how information shared on Facebook and Instagram can be used to further advancements in suicide prevention and support."

Facebook's also adding new resources to its Safety Center to help at-risk users, including Orygen’s #chatsafe guidelines.

"The #chatsafe guidelines were developed together with young people to provide support to those who might be responding to suicide-related content posted by others or for those who might want to share their own feelings and experiences with suicidal thoughts, feelings or behaviors."

In combination, the various measures will help improve Facebook's processes, and provide more support for users - though it would be good to see Facebook also follow Pinterest's lead and add simple mental health exercises that appear in response to related search queries.

[Image: Pinterest's in-app self-help exercises]

That said, Facebook's advances in detection are significant, and are helping an increasing number of at-risk users to address their concerns in a more constructive way. As noted, Facebook, and all social platforms, need to make this a focus - if they're going to make money from such engagement, they should also work to provide such tools, and to address their impacts.

Users can also visit Facebook's Suicide Prevention Center at any time for additional tools and resources.
