How Dangerous Is Misinformation On Facebook?

NPR Technology

NPR's Scott Simon speaks with Roger McNamee, a former mentor to Mark Zuckerberg and an early investor in Facebook. He believes the company threatens democracy because it has helped spread fake news.


The Bots Aren't Coming For The Midterms … They're Already Here

NPR Technology

Twitter bots entered the popular political consciousness during the 2016 campaign. Trolls and artificial accounts were blamed for everything from spreading misinformation to swaying the final vote. Their ultimate effect on actual vote tallies is something scientists -- both data and political -- are still trying to figure out, but we do know that bots can fuel discord. A surprising number of posts on social media are made by bots. And while Twitter and Facebook have been trying to make sure fewer people see misinformation, it's still there.



Facebook didn't do enough to stop election misinformation, report says

Engadget

Facebook missed billions of opportunities to tamp down misinformation ahead of the 2020 presidential election. That's the conclusion of a new report from Avaaz, an advocacy group that researches misinformation online. Avaaz researchers analyzed 100 of the most popular Facebook pages that have repeatedly spread false claims. According to their analysis, posts shared by those pages were viewed more than 10 billion times between March and October. The report also faults Facebook's fact-checking policies, noting that "the top 100 false or misleading stories related to the 2020 elections" were viewed 162 million times in three months, even as Facebook's fact-checkers debunked the claims.


Facebook has removed 7 million posts for coronavirus misinformation

Engadget

If it seems like there's a lot of misinformation about the coronavirus pandemic on Facebook, that's because there is: Between April and June, the social network says it removed 7 million posts for spreading harmful misinformation about COVID-19. It added labels to an additional 98 million posts that fact checkers deemed false but that didn't rise to the level of outright removal. The company released the statistics alongside its community standards enforcement report, which details content takedowns on the social network. Facebook doesn't typically include misinformation statistics in these reports, but the company has imposed stricter rules for claims about the coronavirus that pose "imminent harm." The company removes posts that spread false claims about cures or treatments for COVID-19, as well as other misinformation that health organizations say is dangerous.