Facebook bans 'deepfake' videos in run-up to US election

The Guardian

Facebook has announced a new policy banning AI-manipulated "deepfake" videos that are likely to mislead viewers into thinking someone "said words that they did not actually say," as the social network prepares for the 2020 US election. But the new policy explicitly covers only misinformation produced using AI, meaning that "shallow fakes" – videos made using conventional editing tools – though frequently just as misleading, are still allowed on the platform. The announcement made on Monday by Monika Bickert, Facebook's head of global policy management, sees the company removing misleading video from Facebook and Instagram if it meets two criteria: it "has been edited or synthesised … in ways that aren't apparent to an average person and would likely mislead someone into thinking that a subject of the video said words that they did not actually say," and it "is the product of artificial intelligence or machine learning that merges, replaces or superimposes content onto a video, making it appear to be authentic." To date, there have been no major examples of content that would break such rules.
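The policy as described is a two-part test, and both parts must hold before a video is removed. A minimal sketch of that logic, assuming hypothetical field names (this is not Facebook's actual review tooling), makes the "shallow fake" gap explicit:

```python
# Hypothetical sketch of the stated two-part removal test, not a Facebook API.
# Both conditions must hold, so a "shallow fake" made with conventional
# editing tools (ai_generated=False) is not removed under this policy.

from dataclasses import dataclass

@dataclass
class VideoAssessment:
    # Edited or synthesised in ways not apparent to an average person and
    # likely to mislead viewers about what the subject said
    misleading_edit: bool
    # Product of AI/ML that merges, replaces or superimposes content
    ai_generated: bool

def should_remove(video: VideoAssessment) -> bool:
    """Return True only when both of the policy's criteria are met."""
    return video.misleading_edit and video.ai_generated

# A conventionally edited, misleading video stays up; an AI deepfake comes down.
print(should_remove(VideoAssessment(misleading_edit=True, ai_generated=False)))  # False
print(should_remove(VideoAssessment(misleading_edit=True, ai_generated=True)))   # True
```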


Look out: ads are coming to Facebook Live

USATODAY - Tech Top Stories

LOS ANGELES -- Look out, Facebook fans: the social network is going to increase the number of ads we see in our videos. The company Thursday announced new initiatives to put more TV commercial-style ad breaks in live videos, and will begin testing ad breaks in traditional News Feed videos as well. To run an ad in your live video and share in the revenue, Facebook says you need at least 2,000 followers and an audience of at least 300 viewers in a recent live video. But there is good news for viewers: unlike TV, you won't see the (up to 20-second) ad until at least 4 minutes into the live video. After that, publishers can take additional ad breaks "after a minimum of 5 minutes between each break," says Facebook.
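Put together, the rules described above amount to an eligibility check plus two timing constraints. The following sketch is illustrative only (the constants come from the article; the function names and structure are assumptions, not Facebook code):

```python
# Illustrative sketch of the ad-break rules described in the article.

FIRST_BREAK_MIN_SECONDS = 4 * 60   # first ad no earlier than 4 minutes into the stream
BREAK_GAP_MIN_SECONDS = 5 * 60     # at least 5 minutes between subsequent breaks
MAX_AD_SECONDS = 20                # each ad runs up to 20 seconds

def eligible_for_ad_breaks(followers: int, recent_live_viewers: int) -> bool:
    """Publisher needs 2,000+ followers and 300+ viewers in a recent live video."""
    return followers >= 2000 and recent_live_viewers >= 300

def can_take_break(elapsed_seconds: int, last_break_seconds: int | None) -> bool:
    """True if a new ad break is allowed at this point in the live stream."""
    if last_break_seconds is None:
        return elapsed_seconds >= FIRST_BREAK_MIN_SECONDS
    return elapsed_seconds - last_break_seconds >= BREAK_GAP_MIN_SECONDS

# Example: an eligible publisher, 6 minutes into a stream, no break taken yet.
print(eligible_for_ad_breaks(2500, 450))  # True
print(can_take_break(360, None))          # True
```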


Facebook: We don't allow content that tricks people for profit

ZDNet

Amid outcry that Facebook continues to give a platform to vile and harmful content, the social network on Thursday laid out its broad-based framework for when it does and doesn't censor content. "We do not... allow content that could physically or financially endanger people, that intimidates people through hateful language, or that aims to profit by tricking people using Facebook," Facebook's VP of Policy Richard Allan wrote in a blog post. The post comes on the same day Facebook confirmed it will ban websites that share blueprints for 3D-printed guns. It also follows Facebook's decision earlier in the week to remove pages belonging to Alex Jones, the notorious conspiracy theorist who has propagated the belief that the 2012 Sandy Hook school shooting was a hoax. Facebook said it came to the decision to remove the pages on its own, though the decision was announced shortly after Apple said it would remove Jones' podcasts from its platforms.


Former Microsoft employees sue over PTSD from reviewing child porn

Mashable

Two former Microsoft employees are suing the company because they say the disturbing content they had to view for their jobs caused them to experience post-traumatic stress disorder (PTSD). Both men are suing the tech giant for damages, alleging disability discrimination, violations of the Consumer Protection Act, and negligence. The amount is to be decided at trial, according to the complaint filed in district court. When they confronted their employer about the trauma they experienced, Microsoft told them to play video games or take more smoke breaks instead of providing adequate mental health services, the suit alleges. Greg Blauert and Henry Soto's lawsuit, filed on Dec. 30 of last year, says that for years they had to watch horrifying videos on the internet in order to help keep Microsoft's platforms free from content that would disturb its users or break the law.


Facebook Video: Insight From a Facebook Watch Success Story

#artificialintelligence

Wondering how creators succeed with video on Facebook Watch? Curious how it compares to other social media video? To explore what marketers can learn from a successful Facebook Watch creator, I interview Rachel Farnsworth. Her Facebook Watch show, Recipes, has more than 4 million subscribers. The Social Media Marketing podcast is designed to help busy marketers, business owners, and creators discover what works with social media marketing. Rachel explains how her experience with Facebook Watch compares to videos on her Facebook page and YouTube channel. You'll also discover tips for measuring Facebook video performance and running ads on Facebook Watch. Share your feedback, read the show notes, and get the links mentioned in this episode below. Here are some of the things you'll discover in this show. In 2008, when Rachel became a stay-at-home mom, she started a blog.