Internet Archive rolls out fact-checking on archived webpages

Mashable

Even the Wayback Machine is getting into fact-checking now. In a blog post, the Internet Archive announced it was rolling out fact-checking annotations on certain webpages archived by its Wayback Machine. According to Mark Graham, director of the Wayback Machine, the organization saw the need for the feature after noticing a number of fact-checking groups linking to archived versions of pages. "We are attempting to preserve our digital history but recognize the issues around providing access to false and misleading information coming from different sources," Graham wrote in the post. "By providing convenient links to contextual information we hope that our patrons will better understand what they are reading in the Wayback Machine."
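The annotations Graham describes amount to contextual metadata attached to individual snapshots: a banner on the archived page that links out to a fact-checker's assessment. A minimal sketch of that idea in Python, with an invented data shape (the Wayback Machine's actual schema and field names are not public; everything below is an assumption):

    from dataclasses import dataclass

    # Hypothetical shape for a fact-check annotation on an archived
    # snapshot. All field names here are invented for illustration.
    @dataclass
    class FactCheckAnnotation:
        snapshot_url: str   # the web.archive.org capture being annotated
        context_url: str    # link to the fact-checker's assessment
        source: str         # which fact-checking group provided context
        summary: str        # short banner text shown above the page

    def banner_for(snapshot_url, annotations):
        # Return the banner text for a snapshot, if any group annotated it.
        for a in annotations:
            if a.snapshot_url == snapshot_url:
                return f"{a.summary} (context from {a.source}: {a.context_url})"
        return None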


How Social Media Platforms Could Flatten the Curve of Dangerous Misinformation

Slate

On Thursday, Casey Newton reported that Facebook is piloting a circuit breaker to stop the viral spread of posts in some circumstances. Such a tool, had it been adopted earlier, as one of us (Goodman) proposed and as the Center for American Progress also advanced this week, might have helped stop QAnon's toxic spread; it might still stanch the flow of dangerous incitement and misinformation by introducing friction into the algorithmic amplification of dangerous conspiracies. The news about Facebook comes in the same week that the major social media platforms, having been warned that it was coming, acted quickly to stop the Plandemic sequel from going viral. Things went very differently when the first Plandemic video appeared in May, spreading lies that masks are dangerous and social distancing unnecessary. The video spread with such velocity that it was viewed more than 8 million times in a week before YouTube, Facebook, and Twitter all removed it for violating their policies. The video was a digital pathogen with a high rate of infection--what virologists call the R-naught.
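The epidemiological metaphor can be made concrete with a little arithmetic. If each share generation multiplies a post's audience by a reproduction factor R, reach after g generations is roughly N0 * R^g, so even a modest R compounds into millions of views within days. A toy Python illustration (the seed size and R below are invented numbers, not figures from the article):

    # Toy model of compounding "virality": reach grows geometrically
    # with the reproduction factor R per share generation. The numbers
    # are made up for illustration only.
    def reach(n0, r, generations):
        return n0 * r ** generations

    # With a seed audience of 1,000 and R = 3, reach passes 8 million
    # views after about nine generations of sharing:
    for g in range(10):
        print(g, f"{reach(1000, 3, g):,.0f}")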


Facebook is reportedly testing a 'virality circuit breaker' to stop misinformation

Engadget

Facebook is reportedly piloting a new way to check viral posts for misinformation before they spread too far, The Interface reports. The method is a kind of "virality circuit breaker" that slows the spread of content until moderators have had a chance to review it for misinformation. In a recent report, the Center for American Progress (CAP) recommended virality circuit breakers, which automatically stop algorithms from amplifying posts whose views and shares are skyrocketing. Theoretically, that gives content moderators time to review the posts. According to The Interface, Facebook says it's piloting an approach that resembles a virality circuit breaker and plans to roll it out soon.
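Neither report describes how Facebook's pilot is built, but the circuit-breaker idea maps naturally onto a rate-threshold check: measure share velocity, and pause algorithmic amplification when it spikes until a human review clears the post. A minimal sketch under those assumptions (the window, threshold, and every name below are invented; this is not Facebook's implementation):

    import time
    from collections import defaultdict, deque

    WINDOW_SECONDS = 3600             # measure share velocity over one hour
    SHARES_PER_WINDOW_LIMIT = 10_000  # hypothetical trip threshold

    share_log = defaultdict(deque)    # post_id -> timestamps of recent shares
    paused = set()                    # posts awaiting moderator review

    def record_share(post_id, now=None):
        now = time.time() if now is None else now
        log = share_log[post_id]
        log.append(now)
        # drop shares that have fallen outside the measurement window
        while log and now - log[0] > WINDOW_SECONDS:
            log.popleft()
        if len(log) > SHARES_PER_WINDOW_LIMIT:
            paused.add(post_id)       # trip the breaker: stop amplifying

    def may_amplify(post_id):
        # Feed-ranking hook: tripped posts get no recommendation boost.
        return post_id not in paused

    def review_complete(post_id):
        # A moderator verdict resets the breaker; removal or labeling
        # of violating posts would happen elsewhere in the pipeline.
        paused.discard(post_id)

Tripping on velocity rather than total reach matches the reports' framing: the intervention lands while a post is still climbing, which is exactly when reviewers need the extra time.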


Facebook's dilemma: How to police claims about unproven virus vaccines

The Japan Times

LONDON/NEW YORK – Since the World Health Organization declared the novel coronavirus an international health emergency in January, Facebook Inc. has removed more than 7 million pieces of content with false claims about the virus that could pose an immediate health risk to people who believe them. The social media giant, which has long been under fire from lawmakers over how it handles misinformation on its platforms, said it had in recent months banned such claims as "social distancing does not work" because they pose a risk of "imminent" harm. Under these rules, Facebook took down a video post on Wednesday by U.S. President Donald Trump in which he claimed that children are "almost immune" to COVID-19. But in most instances, Facebook does not remove misinformation about the COVID-19 vaccines still under development, according to the company's vaccine policy lead, Jason Hirsch, on the grounds that such claims do not meet its imminent harm threshold. Hirsch said the company is "grappling" with the dilemma of how to police claims about new vaccines that are as yet unproven.
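The policy Hirsch describes is effectively a two-tier rule: claims that pose a risk of imminent harm are removed outright, while other misinformation falls below that threshold and is handled with softer measures. A hypothetical sketch of that kind of tiered decision (the categories, example claims, and actions are assumptions; Facebook's actual enforcement logic is not public):

    from enum import Enum, auto

    class Action(Enum):
        REMOVE = auto()   # imminent-harm claims are taken down
        DEMOTE = auto()   # reduced distribution, possibly fact-checked
        ALLOW = auto()

    # Hypothetical imminent-harm tier, seeded with claims the article
    # says Facebook has acted on. The real policy list is not public.
    IMMINENT_HARM_CLAIMS = {
        "social distancing does not work",
        "children are almost immune to covid-19",
    }

    def moderate(claim, is_misinformation):
        if claim.lower() in IMMINENT_HARM_CLAIMS:
            return Action.REMOVE
        if is_misinformation:
            return Action.DEMOTE   # below the imminent-harm threshold
        return Action.ALLOW

The hard part the article points to is classification, not enforcement: deciding whether an unproven-vaccine claim belongs in the imminent-harm tier at all.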


Facebook leak reveals policies on restricting New York Post's Biden story

The Guardian

Facebook moderators had to manually intervene to suppress a controversial New York Post story about Hunter Biden, according to leaked moderation guidelines seen by the Guardian. The document, which lays out in detail Facebook's policies for dealing with misinformation on Facebook and Instagram, sheds new light on the process that led to the company's decision to reduce the distribution of the story. "This story is eligible to be factchecked by Facebook's third-party factchecking partners," Facebook's policy communications director, Andy Stone, said at the time. "In the meantime, we are reducing its distribution on our platform. This is part of our standard process to reduce the spread of misinformation. We temporarily reduce distribution pending factchecker review."
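Stone's statement describes a simple pending-review workflow: a post flagged as eligible for fact-checking is demoted immediately, and the demotion is lifted or deepened once third-party partners return a verdict. A hypothetical sketch of that state machine (the state names and ranking multipliers are invented for illustration):

    from enum import Enum, auto

    class ReviewState(Enum):
        NORMAL = auto()
        PENDING_REVIEW = auto()  # distribution temporarily reduced
        RATED_FALSE = auto()
        CLEARED = auto()

    # Multiplier applied to a post's feed-ranking score in each state.
    # The specific values are assumptions, not Facebook's numbers.
    RANKING_MULTIPLIER = {
        ReviewState.NORMAL: 1.0,
        ReviewState.PENDING_REVIEW: 0.5,
        ReviewState.RATED_FALSE: 0.1,
        ReviewState.CLEARED: 1.0,
    }

    def flag_for_factcheck(state):
        # A moderator marks the post eligible for third-party review.
        return ReviewState.PENDING_REVIEW if state is ReviewState.NORMAL else state

    def apply_verdict(rated_false):
        # Fact-checkers returned a rating; lift or deepen the demotion.
        return ReviewState.RATED_FALSE if rated_false else ReviewState.CLEARED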