
The internet is home to at least 14,678 deepfake videos, according to a new report by DeepTrace, a company that builds tools to spot synthetic media. But most of them weren’t created to mess with elections.

Back to the beginning: Deepfakes arrived on the scene in late 2017. The word originally described AI-generated fake porn in which the faces of actresses were swapped onto the bodies of adult-film performers. Since then, the meaning of “deepfakes” has expanded to refer to any kind of AI-manipulated video, like one of Mark Zuckerberg giving a fake speech about Facebook. This has stoked fears about the end of truth and the potential for deepfakes to swing elections.

The internet is for porn: The report found that most deepfakes aren’t attempts to influence politics: a full 96% of them were still plain old fake porn. “Deepfake pornography is a phenomenon that exclusively targets and harms women,” the authors write. All of the pornographic videos featured women, mostly famous actresses and musicians. (The remaining, nonpornographic deepfakes on YouTube mostly featured men.) The total number isn’t that high yet, but what’s worrying is how quickly it’s growing.

Fighting back with law: The issue has caught the attention of legislators. In California, Governor Gavin Newsom just signed into law two bills that limit what people can do with deepfakes. One law makes it illegal to make and distribute a malicious deepfake of a politician within two months of an election. (The ACLU and Electronic Frontier Foundation have already pushed back, saying the law is too broad and will hurt political speech.) The second deepfake law addresses how the technology is actually being used: it lets people sue if their image is used in deepfake porn without their consent.