Facebook uses artificial intelligence to help prevent suicides

PCWorld

Facebook is using a combination of pattern recognition, live chat support from crisis support organizations and other tools to prevent suicide, with a focus on its Live service. There is one death by suicide every 40 seconds and over 800,000 people kill themselves every year, according to the World Health Organization. "Facebook is in a unique position -- through friendships on the site -- to help connect a person in distress with people who can support them," the company said Wednesday. The move appears aimed at preventing the live-streaming of suicides on the Live platform, which launched in April last year and allows people, public figures and pages to share live videos with friends and followers. The company said that its suicide prevention tools for Facebook posts will now be integrated into Live, giving people watching a live video the option to reach out to the person directly and to report the video to the company.


Facebook adds artificial intelligence to its suicide prevention tools

#artificialintelligence

After building up the human component of its network of suicide prevention organizations, Facebook is now bringing in the machines: the social media company announced it has updated its suicide prevention tools with artificial intelligence to improve identification of those at risk, as well as to improve the reporting process and speed up response time. In a company blog post, Vanessa Callison-Burch, Jennifer Guadagno and Antigone Davis (Facebook's Product Manager, Researcher, and Head of Global Safety, respectively) described key features of the updates, which include the integration of the prevention tools with Facebook Live and a testing phase of AI-powered pattern recognition to identify posts likely to include thoughts of suicide. "Based on feedback from experts, we are testing a streamlined reporting process using pattern recognition in posts previously reported for suicide," the company stated. "This artificial intelligence approach will make the option to report a post about 'suicide or self injury' more prominent for potentially concerning posts like these." As Facebook tests that tool, employees will review posts flagged by the software and provide resources if the situation calls for it, even if no one has reported the post yet.
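The announcement does not describe the model behind this pattern recognition, but the approach as stated (learning from posts previously reported for suicide and surfacing the report option for similar new posts) resembles standard supervised text classification. The sketch below is a minimal, hypothetical illustration of that idea using TF-IDF features and logistic regression; the training examples, threshold, and `should_surface_report_option` helper are invented for illustration and are not Facebook's implementation.

```python
# Illustrative sketch only: a generic text-classification take on
# "pattern recognition in posts previously reported for suicide."
# The data, model choice, and threshold are assumptions, not Facebook's system.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Hypothetical training data: posts previously reported for
# "suicide or self injury" (label 1) vs. other posts (label 0).
train_posts = [
    "I can't go on anymore, there's no point",   # previously reported
    "nobody would miss me if I were gone",       # previously reported
    "great hike with friends this weekend",      # not reported
    "excited to start my new job on monday",     # not reported
]
train_labels = [1, 1, 0, 0]

# TF-IDF features plus logistic regression: a simple baseline classifier.
model = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LogisticRegression())
model.fit(train_posts, train_labels)

def should_surface_report_option(post, threshold=0.5):
    """Return True if the post resembles previously reported posts,
    in which case a UI could make the report option more prominent."""
    prob_reported = model.predict_proba([post])[0][1]
    return prob_reported >= threshold

print(should_surface_report_option("I feel like there's no point anymore"))
```

In this framing, the classifier never auto-reports anything; it only decides whether to make the existing "suicide or self injury" report option more visible, with human reviewers still making the final call, which matches the testing process the post describes.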


Facebook battles suicides

FOX News

Recent news coverage has highlighted several disturbing cases of teens and tweens live-streaming their own suicides on Facebook and other social networks. In response, Facebook on Wednesday announced that its suicide prevention tools will now be integrated into Live, so if you're watching a broadcast and someone expresses suicidal thoughts, you can report the video and get the person help. If you're ever in this scenario, press "report live video," and when Facebook asks you what's going on, select the option that says "suicide or self-injury." From there, the broadcaster will see suicide prevention resources on their screen, including a phone number for a crisis help line. When you report a suicidal video, Facebook will also show you information on how to help your friend.


Facebook is offering tools to prevent suicides on Facebook Live

#artificialintelligence

Facebook is turning to artificial intelligence to help identify posts showing suicidal tendencies, putting the technology in a position where it could help save lives. The social network, which has more than 1.8 billion users, announced a suite of new suicide prevention tools on Wednesday, including a streamlined reporting process aided by artificial intelligence, an easier way to get help during a Facebook Live video, and the option to get help via Messenger. But is a machine better than a human when it comes to flagging troubled posts? For now, Facebook is testing its algorithm in the United States, using pattern recognition in posts previously reported for suicide. Facebook will then make the option to report a post more prominent for anything that may be a red flag.


Facebook enlists AI tech to help prevent suicide

#artificialintelligence

Now Facebook is ready, the company announced on Wednesday, to take a significant first step toward building a safer and more supportive Facebook community by strengthening its own suicide prevention tools (Facebook has had suicide reporting tools for a decade). Facebook is also testing a pattern recognition system that will identify posts that include suicidal thoughts. In addition, and perhaps in acknowledgement of the Live.Me video tragedy, Facebook is also introducing suicide prevention tools to Facebook Live posts. The updated system will also offer the option to connect directly with someone from several suicide prevention organizations, including Crisis Text Line and the National Suicide Prevention Lifeline.