For better or worse, we’re entering the era of deepfakes

Deepfakes have seized the public consciousness...but are they good?

As is often the case online, it starts with porn.

Late last year, Motherboard reported on the Reddit user Deepfakes, who was allegedly behind videos that used machine learning to superimpose the faces of celebrities like Gal Gadot and Daisy Ridley onto performers in porn scenes. Within months, Nicolas Cage was recast in “Raiders of the Lost Ark” and “Star Trek,” and Jordan Peele was putting dirty words in the mouth of President Obama. Soon enough, you’ll be able to make a video of your entire body mimicking the exact movements of professional athletes.

The new frontier of AI-assisted image and video synthesis saw major breakthroughs in 2018. But it comes with baked-in ethical issues that concern everyone from celebrities to national security experts to the AI-assisted news anchors on Chinese state media. Namely: if seeing is believing, where do we go from here?

Open Source of Power

Deepfake videos are made using a machine learning technique known as a “generative adversarial network,” or GAN. Developed in 2014 by then-graduate student Ian Goodfellow, the technique pioneered “a way to algorithmically generate new types of data out of existing data sets,” according to The Guardian. In other words, a GAN will study thousands of photos of a person, then produce a novel image of that person, akin to a portrait that was never actually taken.
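For readers who want a feel for how the two networks push against each other, here is a minimal, hypothetical sketch of a GAN training step in TensorFlow/Keras. It is not the code behind FakeApp or any published deepfake tool: the image size, network shapes, and latent dimension are placeholder assumptions, and a real face-swapping pipeline layers far more on top (face detection, alignment, large datasets, and long training runs).

```python
import tensorflow as tf
from tensorflow.keras import layers

IMG_SHAPE = (64, 64, 3)   # placeholder image size (assumption)
NOISE_DIM = 128           # placeholder latent dimension (assumption)

# Generator: maps random noise to a synthetic image.
generator = tf.keras.Sequential([
    tf.keras.Input(shape=(NOISE_DIM,)),
    layers.Dense(8 * 8 * 256, activation="relu"),
    layers.Reshape((8, 8, 256)),
    layers.Conv2DTranspose(128, 4, strides=2, padding="same", activation="relu"),
    layers.Conv2DTranspose(64, 4, strides=2, padding="same", activation="relu"),
    layers.Conv2DTranspose(3, 4, strides=2, padding="same", activation="tanh"),
])

# Discriminator: judges whether an image is real or generated.
discriminator = tf.keras.Sequential([
    tf.keras.Input(shape=IMG_SHAPE),
    layers.Conv2D(64, 4, strides=2, padding="same"),
    layers.LeakyReLU(0.2),
    layers.Conv2D(128, 4, strides=2, padding="same"),
    layers.LeakyReLU(0.2),
    layers.Flatten(),
    layers.Dense(1),  # single real/fake logit
])

bce = tf.keras.losses.BinaryCrossentropy(from_logits=True)
g_opt = tf.keras.optimizers.Adam(1e-4)
d_opt = tf.keras.optimizers.Adam(1e-4)

@tf.function
def train_step(real_images):
    """One adversarial update: the generator tries to fool the
    discriminator, and the discriminator tries not to be fooled."""
    noise = tf.random.normal([tf.shape(real_images)[0], NOISE_DIM])
    with tf.GradientTape() as g_tape, tf.GradientTape() as d_tape:
        fake_images = generator(noise, training=True)
        real_logits = discriminator(real_images, training=True)
        fake_logits = discriminator(fake_images, training=True)

        # Discriminator wants reals labeled 1 and fakes labeled 0.
        d_loss = bce(tf.ones_like(real_logits), real_logits) + \
                 bce(tf.zeros_like(fake_logits), fake_logits)
        # Generator wants its fakes to be labeled 1 (i.e., mistaken for real).
        g_loss = bce(tf.ones_like(fake_logits), fake_logits)

    g_opt.apply_gradients(zip(g_tape.gradient(g_loss, generator.trainable_variables),
                              generator.trainable_variables))
    d_opt.apply_gradients(zip(d_tape.gradient(d_loss, discriminator.trainable_variables),
                              discriminator.trainable_variables))
    return g_loss, d_loss
```

The key idea survives the simplification: the generator only improves by learning what fools the discriminator, which is why the resulting fakes keep getting harder to spot.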

Originally confined to AI researchers, GANs have made their way into the mainstream. The first deepfake porn videos were made using Google’s free open source machine learning software TensorFlow. Now, the desktop program FakeApp makes it easier than ever for anyone with a good computer and some patience to make their own deepfake.

High Tech Meets Fake News

Because pornographic deepfake videos depict celebrities in explicit scenes without their consent, Reddit, Twitter, and Pornhub have all banned them from their platforms. However, there’s another area of concern: the use of deepfake videos to spread political misinformation.

In April, BuzzFeed News and Jordan Peele partnered on a video to warn of the coming dangers of deepfakes.

As the video shows, a good celebrity impersonator can use a deepfake to make a political figure say anything. And the technology isn’t limited to just video.

Siri as Celebrity Impersonator

A recent CNBC report makes clear that political and security experts are taking this problem seriously. In conjunction with the London-based AI firm ASI Data Science, the newly formed Transatlantic Commission on Election Integrity built an online quiz that illustrates the power of AI-assisted audio forgery.

In the quiz, researchers took lines recited by a variety of Donald Trump impersonators and fed them into an audio algorithm trained to mimic Trump’s speech. That audio is played side by side with the impersonators’ recordings, and users vote on which sounds more like Trump. Although the synthetic voice is tinny and its cadence stilted, users overwhelmingly identified the AI-generated audio as sounding more like Trump in each of the four examples.

People Have Already Been Duped

Because deepfake technology is still in its infancy, the quality is often poor enough that viewers can quickly spot the forgery. Still, with social media becoming a tinderbox for fake news, even poorly edited deepfakes can draw outrage from the digital mob.

Take the recent deepfake video posted by the Belgian political party sp.a. In it, President Trump bellows at Belgians to follow America’s lead and withdraw from the Paris climate accord. Although the impression is passable at best, the motion of Trump’s mouth looks beamed in from deep in the uncanny valley, and the video explicitly gives up the game at the end by urging viewers to act on climate change, the post still generated hundreds of comments on the sp.a Facebook page, enough that the party had to clarify on social media that the video was not real.

Artistic Applications

So there are clearly problems that could arise from deepfakes. Others, however, are putting the technology to more creative ends.

The award-winning British artist Gillian Wearing has incorporated deepfakes into her art, as well as her marketing. In a new video that’s part of her Cincinnati Art Museum exhibition “Life: Gillian Wearing,” the conceptual artist cast people to digitally wear her face using deepfake technology.

“I was very interested in using this technology to question the veracity of truth and identity, which are relevant to a lot of things we’re all in the midst of at this moment in time,” Wearing told Fast Company. “I just thought this was an interesting time to think about how I put myself across to people who don’t know who I am, for this particular piece. There is a lot of noise out there, so how do people stand out? It’s an interesting question.”

Snapped Into It

As this technology improves and becomes easier to use, it’s easy to imagine users wanting to insert themselves into videos. After all, many smartphone users are already familiar with Face Swap, which uses similar facial recognition technology, as well as Instagram and Snapchat filters that alter their physical appearance. And the popularity of Bitmoji suggests that people are eager to create realistic digital avatars.

The ability to map your full body onto someone else is just around the corner. A new report in Futurism spotlighted a recent video produced by researchers at Heidelberg University who revealed that they were able to “copy a full body movement from a video and transfer it onto a target person.” While the current iteration is crude and easily identifiable as a fake, the AI shows promise, including the ability to extrapolate details not captured on camera, like a belt extending all the way around a waist.
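Conceptually, this kind of motion transfer can be framed as conditioning an image generator on two inputs: a photo of the target person (their appearance) and the source actor’s pose in each frame. The sketch below is purely illustrative and is not the Heidelberg group’s method; the pose estimator and generator are untrained, stand-in Keras models with made-up shapes, shown only to make the data flow concrete.

```python
import tensorflow as tf
from tensorflow.keras import layers

FRAME_SHAPE = (128, 128, 3)  # placeholder frame size (assumption)
NUM_KEYPOINTS = 17           # placeholder skeleton with 17 joints (assumption)

# Stand-in pose estimator: frame -> (x, y) coordinates of each joint.
# A real system would use a trained pose-estimation network here.
pose_estimator = tf.keras.Sequential([
    tf.keras.Input(shape=FRAME_SHAPE),
    layers.Conv2D(32, 3, strides=2, padding="same", activation="relu"),
    layers.Conv2D(64, 3, strides=2, padding="same", activation="relu"),
    layers.GlobalAveragePooling2D(),
    layers.Dense(NUM_KEYPOINTS * 2),
    layers.Reshape((NUM_KEYPOINTS, 2)),
])

# Stand-in generator: (target appearance frame, source pose) -> a new frame
# showing the target person in the source actor's pose.
appearance_in = tf.keras.Input(shape=FRAME_SHAPE)
pose_in = tf.keras.Input(shape=(NUM_KEYPOINTS, 2))

x = layers.Concatenate()([
    layers.Flatten()(layers.Conv2D(32, 3, strides=4, padding="same")(appearance_in)),
    layers.Flatten()(pose_in),
])
x = layers.Dense(16 * 16 * 64, activation="relu")(x)
x = layers.Reshape((16, 16, 64))(x)
x = layers.Conv2DTranspose(64, 4, strides=2, padding="same", activation="relu")(x)
x = layers.Conv2DTranspose(32, 4, strides=2, padding="same", activation="relu")(x)
out = layers.Conv2DTranspose(3, 4, strides=2, padding="same", activation="tanh")(x)
generator = tf.keras.Model([appearance_in, pose_in], out)

def transfer_motion(source_video, target_frame):
    """Re-render the target person frame by frame in the source actor's poses."""
    poses = pose_estimator(source_video)                              # (frames, joints, 2)
    targets = tf.repeat(target_frame[None, ...],
                        tf.shape(source_video)[0], axis=0)            # copy target photo per frame
    return generator([targets, poses])                                # (frames, H, W, 3)

# Toy usage with random tensors standing in for real footage.
source_video = tf.random.uniform((8, *FRAME_SHAPE))   # 8 frames of the source actor
target_frame = tf.random.uniform(FRAME_SHAPE)         # one photo of the target person
fake_frames = transfer_motion(source_video, target_frame)
print(fake_frames.shape)  # (8, 128, 128, 3)
```

In practice both networks would be trained on real footage, and the generator is the part that fills in details the camera never saw, such as the far side of a belt.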

Strictly for the LOLZ

And then there are those who are simply using the tech for humorous parodies. YouTube user Derpfakes has been uploading clips to his channel for nearly a year, many of which feature internet darling Nicolas Cage in a variety of iconic roles. The most recent is a clever riff on Face/Off, in which Cage’s face is digitally transferred onto John Travolta’s.

Despite the lighthearted nature of his videos, Derpfakes seems well aware of the potential risks of the technology. Speaking anonymously with ABC News, the YouTuber admitted that “all deepfakes have a level of controversy attached to them,” before adding, “from my perspective the best approach… is to make the public familiar with the idea that what they see is not true by default… Either way, it’s here to stay.”

The AI researchers at Heidelberg University agree. “Generally speaking, any improvement by the research community on the topic of image synthesis carries the inherent risk of being misused,” Björn Ommer told Futurism.

Still, the potential for misuse has rarely halted technological exploration, and ultimately ethics are dictated by the end user. We could use this technology to delight our children by putting them inside a scene from “Star Wars,” or to humiliate an ex with deepfake revenge porn. The choice will be ours to make, and given how fast the technology is advancing, we may be forced to make it soon.