Facebook is now offering a raft of features aimed at preventing potential suicides worldwide.
The Menlo Park-based social media giant says the suicide prevention features allow users to anonymously flag posts from friends who appear to be exhibiting signs of self-harm. Facebook will then send that person a message of concern, including links to suicide prevention resources and hotlines.
"Now, with the help of these new tools, if someone posts something on Facebook that makes you concerned about their well-being, you can reach out to them directly and you also can report the post to us. We have teams working around the world, 24/7, who review reports that come in," Facebook global head of safety Antigone Davis and researcher Jennifer Guadagno said, The Register reported.
Flagged posts will be reviewed by a team trained in communicating with people at risk of suicide. As the New York Times has reported, the suicide rate in the United States has reached a 30-year high.
Suicide rates are particularly high among middle-aged Americans and among women, suggesting a sense of desperation nationwide. In response to the alarming rise in self-inflicted deaths and injuries, President Barack Obama issued a proclamation marking World Suicide Prevention Day last September, highlighting the importance of identifying early warning signs of mental health problems and promoting community support for individuals at risk of suicide.
With its vast global reach and an enormously diverse user base, Facebook has long been involved in debates of great societal importance. According to a 2015 Pew Research study, Facebook is used by a large majority of online American adults, including 77% of online women.
But there is a fine line between trying to help prevent suicide and being perceived as a watchful 'Big Brother.' While some users appreciate Facebook's effort to be a positive force for social good, the suicide prevention features also risk raising privacy concerns.
Two years ago, the company was forced to revamp its user research practices after revelations that Facebook researchers had used the site's news feed to manipulate users' emotions. More recently, the company also came under fire for what some users perceived as politically biased news feed curation favoring liberal views.