Facebook will start removing misleading and inflammatory posts that may trigger violent attacks, the social network said Wednesday, as it faces criticism over its response to sectarian conflict in countries such as Myanmar and Sri Lanka. The policy applies to written posts and manipulated images. Civil-society groups and threat-intelligence agencies are among the partners that Facebook said will help the company flag incendiary posts and review their potential impact. Facebook said that its local and international partners must verify whether the information being shared is false and show that the material could contribute to imminent violence. Once the threat is confirmed, Facebook said it will remove the content and take down similar posts.
Experts have blasted Facebook over its latest content guidelines, with one prominent techno-sociologist stating the company was in "over its head". The social network's stance on content moderation has come under intense scrutiny in recent days, following comments from CEO Mark Zuckerberg as he tried to explain where, and how, Facebook draws the line on questionable posts. Facebook has strict guidelines governing posts that contain nudity, the selling of guns, and credible and explicit threats of violence; however, grey areas still exist over what should and should not be allowed on the site. In an attempt to clarify where the company draws the line, Zuckerberg said Holocaust deniers should not have their opinions removed from the social media platform as they are entitled to exercise their right of free speech. However, experts have criticised this approach.
Mark Zuckerberg defended the rights of Facebook users to publish Holocaust denial posts, saying he didn't "think that they're intentionally getting it wrong". In an interview with Recode published Wednesday, the CEO also explained Facebook's decision to allow the far-right conspiracy theory website Infowars to continue using the platform, saying the social network would try to "reduce the distribution of that content", but would not censor the page. Zuckerberg's comments came the same day that Facebook announced a new policy pledging to remove misinformation used to incite physical harm. The CEO's remarks to Recode have reignited debates about free speech on the social network at a time when Facebook continues to face scrutiny over its role in spreading misinformation, propaganda and hate speech across the globe. Last year, the Guardian reported on internal Facebook moderation documents which suggested that the company disregarded Holocaust denial laws except in countries where it was likely to be sued or prosecuted.
Facebook wants you to know it's trying really hard to deal with the ways people use its platform to cause harm. It just doesn't know exactly what to do. What separates hate speech from offensive ideas, or misinformation from coordinated disinformation intended to incite violence? What should the company allow to remain on the platform, and what should it ban? Two years after Russians weaponized Facebook as part of a large-scale campaign to interfere with US democracy, the social network is still struggling to answer those questions, as the past two weeks have made clear.
Facebook on Wednesday announced a new policy for removing from the platform misinformation, including altered imagery, that is intended to cause or exacerbate violence. Misleading and inaccurate information spread through the social media site has been linked to violence in Myanmar and Sri Lanka. Extremist Buddhists in Sri Lanka have taken to Facebook to post misinformation denigrating the Muslim minority, which has led to communal attacks. A Facebook spokesperson wrote in a statement, "Reducing the distribution of misinformation, rather than removing it outright, strikes the right balance between free expression and a safe and authentic community. There are certain forms of misinformation that have contributed to physical harm, and we are making a policy change which will enable us to take that type of content down."