DeepNude app stirs controversy over deepfakes and AI-generated pornography
Image: DeepNude

Even though the technology behind deepfakes is undoubtedly impressive, the news surrounding its use is routinely negative. The latest deepfake story fits that pattern: users online uncovered an app that generates realistic nude images of women.

The app, cleverly titled DeepNude, was first uncovered by Motherboard. Users could feed DeepNude any picture of a clothed woman, and the app would remove the clothing, producing a realistic nude. It’s another instance of someone using deepfake technology for ultimately disconcerting ends.

How Did DeepNude Work?

According to Motherboard, a man who identifies himself as “Alberto” created DeepNude. He told Motherboard that he got the idea from “X-ray specs” ads that he had seen in the ‘60s and ‘70s. 


“Like everyone, I was fascinated by the idea that they could really exist and this memory remained,” Alberto told Motherboard. “About two years ago I discovered the potential of AI and started studying the basics. When I found out that GAN networks were able to transform a daytime photo into a nighttime one, I realized that it would be possible to transform a dressed photo into a nude one. Eureka. I realized that x-ray glasses are possible!”
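
Neither the article nor Alberto’s comments spell out DeepNude’s internals, but the day-to-night transformation he describes is the kind of paired image-to-image translation that conditional GANs such as pix2pix perform: a generator learns to map a source photo to a target photo, while a discriminator learns to tell real pairs from generated ones. The sketch below is illustrative only; the toy networks, the 64x64 random tensors, and the loss weighting are assumptions for the example, not DeepNude’s actual code.

import torch
import torch.nn as nn

class Generator(nn.Module):
    """Maps a source image to a translated image (e.g., day -> night)."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(3, 64, 4, stride=2, padding=1),    # 64x64 -> 32x32
            nn.ReLU(inplace=True),
            nn.Conv2d(64, 128, 4, stride=2, padding=1),  # 32x32 -> 16x16
            nn.ReLU(inplace=True),
            nn.ConvTranspose2d(128, 64, 4, stride=2, padding=1),  # 16x16 -> 32x32
            nn.ReLU(inplace=True),
            nn.ConvTranspose2d(64, 3, 4, stride=2, padding=1),    # 32x32 -> 64x64
            nn.Tanh(),
        )

    def forward(self, x):
        return self.net(x)

class Discriminator(nn.Module):
    """Scores a (source, output) pair as real or generated (PatchGAN-style)."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(6, 64, 4, stride=2, padding=1),   # 6 channels: source + output stacked
            nn.LeakyReLU(0.2, inplace=True),
            nn.Conv2d(64, 1, 4, stride=2, padding=1),   # map of real/fake scores
        )

    def forward(self, src, tgt):
        return self.net(torch.cat([src, tgt], dim=1))

G, D = Generator(), Discriminator()
opt_g = torch.optim.Adam(G.parameters(), lr=2e-4)
opt_d = torch.optim.Adam(D.parameters(), lr=2e-4)
bce = nn.BCEWithLogitsLoss()
l1 = nn.L1Loss()

# Toy "paired" batch; real training would use aligned source/target photos.
src = torch.randn(8, 3, 64, 64)
tgt = torch.randn(8, 3, 64, 64)

# Discriminator step: push real pairs toward 1, generated pairs toward 0.
fake = G(src).detach()
real_score, fake_score = D(src, tgt), D(src, fake)
d_loss = bce(real_score, torch.ones_like(real_score)) + \
         bce(fake_score, torch.zeros_like(fake_score))
opt_d.zero_grad()
d_loss.backward()
opt_d.step()

# Generator step: fool the discriminator, plus an L1 term (weighted as in
# the pix2pix paper) pulling outputs toward the paired targets.
fake = G(src)
score = D(src, fake)
g_loss = bce(score, torch.ones_like(score)) + 100.0 * l1(fake, tgt)
opt_g.zero_grad()
g_loss.backward()
opt_g.step()

Trained at scale on aligned image pairs, the same recipe handles many photo-to-photo translations, which is precisely what makes it both useful and easy to misuse.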

Despite DeepNude giving users the power to generate naked pictures of clothed women, Alberto insisted that he is not a “voyeur,” characterizing himself instead as a “technology enthusiast.” To Alberto, DeepNude is no different from someone using Photoshop to remove clothing and alter photos.

“I also said to myself: the technology is ready (within everyone’s reach),” Alberto said. “So if someone has bad intentions, having DeepNude doesn’t change much… If I don’t do it, someone else will do it in a year.”

Since the Motherboard article ran, DeepNude (a premium version of which sold online for $99) has been pulled down. Its creator cited concerns that people would “misuse” the app, adding that the “world is not yet ready for DeepNude.”

Deepfakes Targeting Women and Other Vulnerable Groups

Since the technology’s inception, people have routinely used deepfakes to create falsified porn videos. DeepNude is just one example of AI technology being used to target women.

A June HuffPost article highlighted six women whose likenesses were inserted into porn videos without their consent. The article also pointed out that deepfake pornography is now a sizeable niche online, with many tube sites carrying deepfake videos. And while most public concern about deepfakes centers on fake videos of prominent political leaders, in practice deepfakes more often target the less privileged.

“The harm done to women when it comes to this kind of sexual objectification is happening now,” Mary Anne Franks, a law professor at the University of Miami and president of the Cyber Civil Rights Initiative, told HuffPost. “It’s almost like people have forgotten that this is what this technology really started out as, and the conversation around women has fallen away.”

Deepfakes might also target minorities, LGBTQ people, and other marginalized groups down the road. Deepfake technology wasn’t involved, but back in 2017 CNN uncovered social media accounts linked to the Russian government that targeted Black communities. Posing as a group called Blacktivist, the accounts sought to stoke racial tensions.

With deepfake technology improving, people looking to sow confusion and discord may have another tool at their disposal. And the people they target will most likely be those with little recourse.

Like a lot of technology, deepfakes cut both ways: they are impressive from a technical standpoint and concerning in equal measure. And as with many tech innovations, much comes down to how people choose to use them.

With deepfakes, there are still more questions than answers.
