Can Facebook's new approach help prevent suicides?

#artificialintelligence

March 1, 2017 -- One reason people may commit suicide, behavioral scientists say, is that they feel isolated from – or not supported by – their communities. With its latest rollout of suicide prevention tools, Facebook aims to empower its vast online community to take action and prevent these deaths. On Wednesday, the tech giant announced the latest elements of its suicide-prevention strategy. Under the new changes, a Facebook user who is worried that a friend streaming on Facebook Live may be contemplating suicide can report the video to Facebook, which will then provide resources, including suicide prevention tips and ways to get help via live chat, in real time to both users. And if friends report that someone they know might be considering suicide, Facebook will direct them to tools to help them reach out.


Facebook enlists AI tech to help prevent suicide

#artificialintelligence

Can Facebook use all that it knows about us to help stop someone from dying by suicide? It's been more than a rhetorical question since January, when a video, pulled from the social media platform Live.Me and shared on Facebook, showed 12-year-old Katelyn Nicole Davis taking her own life. Facebook couldn't control the spread of the video and appeared unsure whether it even violated its own terms of service. A month later, Facebook CEO Mark Zuckerberg's 6,000-word global community manifesto made it clear that Facebook is ready to take on a more parental role, one that acknowledges its incredible influence and impact on nearly 2 billion people around the world. The Facebook community is in a unique position to help prevent harm, assist during a crisis, or come together to rebuild afterwards.


Facebook uses artificial intelligence to help prevent suicides

PCWorld

Facebook is using a combination of pattern recognition, live chat support from crisis support organizations and other tools to prevent suicide, with a focus on its Live service. There is one death by suicide every 40 seconds, and over 800,000 people kill themselves every year, according to the World Health Organization. "Facebook is in a unique position -- through friendships on the site -- to help connect a person in distress with people who can support them," the company said Wednesday. The move appears aimed at preventing the live-streaming of suicides on the Live platform, which launched in April last year and allows people, public figures and pages to share live videos with friends and followers. The company said that its suicide prevention tools for Facebook posts will now be integrated into Live, giving people watching a live video the option to reach out to the person directly and to report the video to the company.


Facebook rolling out AI tools to help prevent suicides

#artificialintelligence

In yet another attempt to prevent suicides, Facebook is starting to roll out Artificial Intelligence (AI)-based tools to help identify when someone might be expressing thoughts of suicide, including on Facebook Live. The initiative – which will use pattern recognition to detect posts or live videos where someone might be expressing thoughts of suicide, so that authorities can respond faster – will eventually be available worldwide, except in the European Union, Facebook said in a blog post on Tuesday. "Facebook is a place where friends and family are already connected and we are able to help connect a person in distress with people who can support them. It's part of our ongoing effort to help build a safe community on and off Facebook," wrote Guy Rosen, Vice President of Product Management at Facebook. In October, Facebook worked with first responders on over 100 wellness checks based on reports it received via its proactive detection efforts. "We use signals like the text used in the post and comments (for example, comments like 'Are you ok?' and 'Can I help?' can be strong indicators). In some instances, we have found that the technology has identified videos that may have gone unreported," Rosen said.
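
To make the "signals in the text and comments" idea concrete, here is a minimal, hypothetical sketch of keyword-based scoring over a post and its comments. Facebook has not published its actual model, so every phrase, weight, threshold and function name below is an invented assumption for illustration, not the company's implementation.

```python
# Toy sketch only: a crude keyword/pattern scorer loosely mirroring the idea of
# combining signals from a post's own text with concerned comments such as
# "Are you ok?" and "Can I help?". All phrases, weights and the threshold are
# hypothetical; a production system would use learned classifiers and human review.
import re

POST_PATTERNS = [r"\bwant to die\b", r"\bend it all\b", r"\bno reason to live\b"]
COMMENT_PATTERNS = [r"\bare you ok\b", r"\bcan i help\b", r"\bplease talk to me\b"]

def risk_score(post_text, comments):
    """Return a rough 0..1 score combining post-text and comment signals."""
    post_hits = sum(bool(re.search(p, post_text.lower())) for p in POST_PATTERNS)
    comment_hits = sum(
        bool(re.search(p, c.lower())) for c in comments for p in COMMENT_PATTERNS
    )
    # Weight the author's own words above comments from concerned friends.
    return min((post_hits + 0.5 * comment_hits) / 3.0, 1.0)

def should_escalate(post_text, comments, threshold=0.6):
    """Flag a post for human review; the final decision stays with trained reviewers."""
    return risk_score(post_text, comments) >= threshold

if __name__ == "__main__":
    post = "I feel like there's no reason to live anymore."
    comments = ["Are you ok?", "Can I help? Please message me."]
    print(risk_score(post, comments), should_escalate(post, comments))  # ~0.67 True
```

In a real deployment the hand-written patterns would presumably be replaced by trained text models, but the escalation step is the part that matters here: flagged posts go to people, as with the wellness checks Rosen describes, rather than triggering any automated action on their own.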