Most Deepfakes Are Porn, and They're Multiplying Fast
In November 2017, a Reddit account called deepfakes posted pornographic clips made with software that pasted the faces of Hollywood actresses over those of the original performers. Nearly two years later, deepfake has become a generic term for video manipulated or fabricated with artificial intelligence software. The technique has drawn laughs on YouTube, along with concern from lawmakers fearful of political disinformation. Yet a new report tracking the deepfakes circulating online finds they mostly remain true to their salacious roots. The startup Deeptrace took a kind of deepfake census during June and July to inform its work on detection tools it hopes to sell to news organizations and online platforms.
Oct-7-2019, 14:17:07 GMT