Instagram CEO unsure of what to do with 'deepfaked' video - says the company doesn't have a policy

Daily Mail - Science & tech

The CEO of Instagram has defended the company's decision not to take down a deepfaked video of Mark Zuckerberg two weeks after the doctored video was reported. Adam Mosseri told CBS' Gayle King, in his first US television interview since taking over the platform last year, that the company has not yet formulated an official policy on AI-altered videos known as 'deepfakes', and that taking action before then would be 'inappropriate.' Mosseri said, 'I don't feel good about it,' but added there is no rush to remove the video, in part because 'the damage is done.' His comments came in response to King's questions about a faked video of Facebook CEO Mark Zuckerberg built from an actual 2017 interview with CBSN. The doctored video shows a fairly convincing Zuckerberg next to a superimposed CBSN logo, talking about how Facebook wields power over its users.


Samsung developing algorithm that only needs one picture to create a fake video

Daily Mail - Science & tech

As if the world of deepfaked pictures and video wasn't scary enough, researchers from Samsung's AI center in Moscow have demonstrated an algorithm that can fabricate videos from only one image. In a video demonstration and a paper posted on the pre-print server arXiv, the researchers show the capabilities of what is described as 'one-shot' and 'few-shot' machine learning. Their system brings to life famous faces, such as those of surrealist painter Salvador Dali and actress Marilyn Monroe, using a single still image. The more images fed into the program, the more realistic the resulting video becomes: while a single image translated into a moving face may look noticeably altered, a sample of 32 images produces a moving picture with near-lifelike accuracy.
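For readers curious how 'few-shot' conditioning works in practice, the sketch below is a heavily simplified illustration, not the Samsung team's code: an embedder network averages one or more reference frames of a person into an identity vector, and a generator renders a new frame from that vector plus a target pose. All module names, layer sizes and image shapes are assumptions made for the example.

```python
# Minimal sketch of the "few-shot" idea: more reference frames give a
# steadier identity embedding, which conditions the frame generator.
import torch
import torch.nn as nn

EMB_DIM = 128

class Embedder(nn.Module):
    """Maps reference frames (K, 3, 64, 64) to one identity embedding."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(3, 32, 4, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(32, 64, 4, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
            nn.Linear(64, EMB_DIM),
        )

    def forward(self, frames):
        return self.net(frames).mean(dim=0)    # average over the K shots

class Generator(nn.Module):
    """Renders a frame from a pose/landmark image plus the identity embedding."""
    def __init__(self):
        super().__init__()
        self.pose_enc = nn.Sequential(
            nn.Conv2d(1, 32, 4, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
        )
        self.render = nn.Sequential(nn.Linear(32 + EMB_DIM, 3 * 64 * 64), nn.Tanh())

    def forward(self, pose, identity_emb):      # pose: (1, 64, 64)
        feat = self.pose_enc(pose.unsqueeze(0)).squeeze(0)
        out = self.render(torch.cat([feat, identity_emb]))
        return out.view(3, 64, 64)

embedder, generator = Embedder(), Generator()

# "One-shot" vs "few-shot": a single still image versus 32 frames of the same person.
one_shot = torch.rand(1, 3, 64, 64)
few_shot = torch.rand(32, 3, 64, 64)
pose = torch.rand(1, 64, 64)                   # target head pose / landmarks

frame_1 = generator(pose, embedder(one_shot))
frame_32 = generator(pose, embedder(few_shot))
print(frame_1.shape, frame_32.shape)           # both (3, 64, 64)
```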


Adobe unveils new AI that can detect if an image has been 'deepfaked'

Daily Mail - Science & tech

Adobe researchers have developed an AI tool that could make spotting 'deepfakes' a whole lot easier. The tool detects edits to images, including those that would go unnoticed by the naked eye, especially in doctored deepfake videos. It comes as deepfake videos, which use deep learning to digitally splice fake audio onto the mouth of someone talking, continue to be on the rise. Deepfakes are so named because they utilise deep learning, a form of artificial intelligence, to create fake videos.
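Details of Adobe's model are not given here, but the general recipe for this kind of detector is well established: train a convolutional classifier on pairs of untouched and edited images so it learns to flag manipulation. The sketch below is a minimal, assumed illustration of that recipe, not Adobe's tool; the random tensors stand in for a real training set.

```python
# Minimal sketch of a binary "edited or not" patch classifier.
import torch
import torch.nn as nn

class EditDetector(nn.Module):
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
        )
        self.classifier = nn.Linear(32, 1)   # logit: how likely the patch was edited

    def forward(self, x):
        return self.classifier(self.features(x))

model = EditDetector()
criterion = nn.BCEWithLogitsLoss()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)

# Dummy batch standing in for real (patch, was-it-edited) training pairs.
patches = torch.rand(8, 3, 128, 128)
labels = torch.randint(0, 2, (8, 1)).float()

logits = model(patches)
loss = criterion(logits, labels)
loss.backward()
optimizer.step()
print(f"edited probability of first patch: {torch.sigmoid(logits[0]).item():.2f}")
```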


Deepfakes: What fairies and aliens can teach us about fake videos

#artificialintelligence

"Deepfake" is the name being given to videos created through artificially intelligent deep learning techniques. Also referred to as "face-swapping", the process involves inputting a source video of a person into a computer, and then inputting multiple images and videos of another person. The neural network then learns the movements and expressions of the person in the source video in order to map the other's image onto it to look as if they are carrying out the speech or act. This practice was first used extensively in the production of fake pornography in late 2017 – where the faces of famous female celebrities were swapped in. Research has consistently shown that pornography leads the way in technological adoption and advancement when it comes to communication technologies, from the Polaroid camera to the internet.


Deep fake images of Earth could be used to trick military analysts, experts say

Daily Mail - Science & tech

The future of 'deep fake' technology could be much worse than doctored videos of celebrities and politicians. U.S. military experts are raising concerns about a new variant of deep fakes, videos that use AI to make subjects appear to say or do something they really didn't, that could involve doctored satellite images of the Earth, according to Defense One. It comes as social media giants, researchers and other experts have been working to outsmart deep fake videos. Deepfakes are so named because they utilize deep learning, a form of artificial intelligence. They are made by feeding a computer an algorithm, or set of instructions, along with lots of images and audio of a certain person.
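One widely used technique behind such synthetic imagery is the generative adversarial network (GAN), in which a generator is trained against a discriminator that tries to tell its output apart from the real images it is fed. The sketch below is a generic, minimal GAN training step, not any specific system used for satellite imagery; all sizes and names are illustrative assumptions.

```python
# Minimal sketch of one adversarial training step: discriminator, then generator.
import torch
import torch.nn as nn

generator = nn.Sequential(nn.Linear(64, 3 * 32 * 32), nn.Tanh())
discriminator = nn.Sequential(nn.Flatten(), nn.Linear(3 * 32 * 32, 1))
bce = nn.BCEWithLogitsLoss()
opt_g = torch.optim.Adam(generator.parameters(), lr=2e-4)
opt_d = torch.optim.Adam(discriminator.parameters(), lr=2e-4)

real = torch.rand(16, 3, 32, 32)              # stand-in for the real imagery fed to the model
noise = torch.randn(16, 64)

# Discriminator step: label real images 1, generated images 0.
fake = generator(noise).view(-1, 3, 32, 32)
d_loss = bce(discriminator(real), torch.ones(16, 1)) + \
         bce(discriminator(fake.detach()), torch.zeros(16, 1))
opt_d.zero_grad(); d_loss.backward(); opt_d.step()

# Generator step: try to make the discriminator call the fakes real.
g_loss = bce(discriminator(fake), torch.ones(16, 1))
opt_g.zero_grad(); g_loss.backward(); opt_g.step()
print(f"d_loss={d_loss.item():.3f}  g_loss={g_loss.item():.3f}")
```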