Preventing Suicide on Facebook
Reading time: 2 min

The use of social media websites and apps can be mentally draining. Checking in to see what your friends and family are up to on a day-to-day basis can be a bit overwhelming. You may even find yourself on Facebook asking, from time to time, ‘what good is any of this?’ To be sure, social media sites like Facebook can be an invaluable resource for keeping in touch with people from your past, giving you a window into the lives of others you would otherwise be in the dark about. You can also use social media to get a sense of how your loved ones are doing, both physically and mentally.

While the majority of what people talk about on social media is trivial in nature, there are times when people use these platforms to express negative feelings about themselves, potentially waving a red flag that could prompt friends to come together and help their loved one get the support they need. As you are probably aware, scanning all of your Facebook friends’ timelines for signs of trouble would be an onerous task. Let’s face it, some people have thousands of FB friends.

The people working at Facebook seem to understand how their platform can be used to avert catastrophe. Last summer, the company launched a suicide prevention tool. FB users can use the tool to flag friends’ posts that seem indicative of depression or suicidal thoughts. A team at the company then reviews the flagged posts and advises the “flaggers” on how to talk with the friend they are concerned about.

Taking the mission to save lives one step further, the social media giant is working on helping people with mental illness without a friend even needing to flag a post. Facebook is testing artificial intelligence (AI) algorithms it created to identify potential warning signs in users’ posts and in friends’ comments, the BBC reports. Initially, the algorithm will be tested only in the U.S.

After the tool identifies a post of concern, a team of humans reviews it to determine whether the concern is valid, according to the article. If the team confirms that a user is at risk of self-harm, the “social network” will reach out to them and suggest helpful resources. The use of AI to prevent suicide is actually just the tip of the iceberg; please watch the short video for more information on the novel methods being employed at Facebook.
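For readers curious about what this kind of two-stage workflow looks like in practice, here is a minimal, hypothetical sketch: an automated scorer flags worrying posts, and a human reviewer confirms the concern before any outreach happens. The keyword list, threshold, and function names below are illustrative assumptions only, not Facebook’s actual system.

```python
# Hypothetical sketch of a two-stage triage pipeline: an automated scorer
# flags posts that may indicate distress, and every flagged post goes to a
# human reviewer before any outreach happens. Keywords, threshold, and
# names are illustrative only -- not Facebook's actual system.

from dataclasses import dataclass

# Stand-in for a trained text classifier: a crude keyword score.
WARNING_PHRASES = ("can't go on", "no reason to live", "want to disappear")


@dataclass
class Post:
    author: str
    text: str


def risk_score(post: Post) -> float:
    """Return a 0-1 score; a real system would use a learned model."""
    text = post.text.lower()
    hits = sum(phrase in text for phrase in WARNING_PHRASES)
    return min(1.0, hits / 2)


def triage(posts: list[Post], threshold: float = 0.5) -> list[Post]:
    """Stage 1: automatically flag posts that exceed the risk threshold."""
    return [p for p in posts if risk_score(p) >= threshold]


def human_review(flagged: list[Post]) -> list[Post]:
    """Stage 2: placeholder for the human team that confirms real risk."""
    # A reviewer would read each post in context; this toy version approves all.
    return flagged


def reach_out(post: Post) -> None:
    """If risk is confirmed, contact the user with support resources."""
    print(f"Offering support resources to {post.author}")


if __name__ == "__main__":
    feed = [
        Post("alex", "Great hike today, feeling good!"),
        Post("sam", "I feel like I can't go on and there's no reason to live."),
    ]
    for post in human_review(triage(feed)):
        reach_out(post)
```

The key design choice mirrored here is keeping a human in the loop: the algorithm only surfaces candidates, while a person decides whether to offer help.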

“Their ongoing and future efforts give me great hope for saving more lives globally from the tragedy of suicide,” said Dr. Dan Reidenberg, executive director of Save.org, which is involved in the initiative. “The opportunity for prevention, even with Facebook Live, is better now than ever before.”