Facebook Beefs up Suicide-Prevention Tools
Over the past several years, Facebook has put procedures in place to help people in crisis and, on Wednesday, announced new tools to empower Facebook users to intervene when they believe that someone they know may be contemplating self-harm or suicide.
The announcement focused on new Facebook tools for helping people “in real time on Facebook Live,” facilitating live chat support from crisis support organizations via Facebook Messenger and “streamlined reporting for suicide, assisted by artificial intelligence.”
Facebook said that it already has 24/7 teams in place to review reports and prioritize the most serious, like suicide. The company also has been providing people who express suicidal thoughts with support, including advising them to reach out to a friend or a support organization. The company has worked with psychologists and support groups to recommend helpful text that you might use if someone you know on Facebook is expressing suicidal thoughts.
To the right of each Facebook post is a down arrow. If you click it, you get options including “I think it (the post) doesn’t belong on Facebook.” Clicking that takes you to an option for posts that are “threatening, violent or suicidal,” and if you click that, you’re taken to an area where you can get advice, including specific language to use if you want to offer help or support. That same area on the site or app also lets you send a text to a trained counselor at the Crisis Text Line, call the Suicide Prevention Lifeline (800-273-8255) or confidentially report the post to Facebook so that its personnel can “look at it as soon as possible.”
The company has also produced an educational video (http://tinyurl.com/FBselfharm) that provides useful tips on how to respond if a friend is “in need.”