**Update: If you’re looking for our DeepNude article, you’re in the right place. We’ve updated this article to give you an idea of how dangerous DeepNude, and deepfake technology in general, really is. So dangerous, in fact, that new legislation outlaws deepfakes during election season (and in some countries, outlaws them altogether unless accompanied by a disclaimer). Read on to learn more.**
In the past few years, the growing use of misinformation to influence voters has loomed over elections. One of the most problematic developments leading up to 2020 has been the use of deepfakes. These AI-doctored videos overlay faces of celebrities or politicians on other bodies to create unnerving, realistic effects.
Now, California Governor Gavin Newsom has signed a law that makes it illegal to produce or distribute deceptive deepfakes of political candidates in the run-up to an election.
‘Powerful and Dangerous New Technology’
On Thursday (October 3), Newsom signed AB 730 into law. The bill itself doesn’t contain the word “deepfake,” but it’s safe to assume California introduced the legislation to combat misleading videos made with deepfake technology. The new law prohibits individuals from distributing video or audio intentionally altered to misrepresent or damage a candidate’s words or actions within 60 days of an election.
The bill also includes some exceptions. News outlets are exempt, as are videos that qualify as parody or satire, though those distinctions grow blurrier every day thanks to online discourse. Individuals can also comply with the law by adding a disclaimer to their video or audio stating that it is fake. The law will stay in effect until 2023.
“Deepfakes are a powerful and dangerous new technology that can be weaponized to sow misinformation and discord among an already hyper-partisan electorate,” said Marc Berman, the assemblyman who introduced the bill. “Deepfakes distort the truth, making it extremely challenging to distinguish real events and actions from fiction and fantasy. AB 730 seeks to protect voters from being tricked and influenced by manipulated videos, audio recordings, or images before an election.”
Free Speech Roadblocks
Enforcing the bill is another matter, as it raises a number of free speech considerations. Speaking with The Guardian, Jane Kirtley, a professor of media ethics and law at the Hubbard School of Journalism and Mass Communication, said that combating deepfake videos through copyright claims rather than legislation might be more effective.
“Political speech enjoys the highest level of protection under US law,” said Kirtley. “The desire to protect people from deceptive content in the run-up to an election is very strong and very understandable, but I am skeptical about whether they are going to be able to enforce this law.”
Porn Deepfakes Are Out Too
The main reason deepfakes became a thing in the first place was, well, porn. People online (read: dudes) have historically used deepfakes to target women. Indeed, an entire shady corner of the internet centers on fake porn videos featuring the likenesses of celebrities.
On Thursday, Newsom also signed AB 602 into law. The legislation allows individuals to sue if their likeness is used in sexually explicit videos or images without their consent. It’s a positive step, as this use of deepfake technology probably won’t disappear anytime soon.