Typically, when deepfake technology makes headlines, it isn't good news. Recently, however, the widely criticized artificial intelligence (AI) tool, which can create uncannily realistic fabricated videos, was used to draw attention to a worthy cause.

Nonprofit organization Malaria No More hired advertising agency R/GA London to produce an awareness-raising advertisement. The firm, in turn, worked with soccer superstar David Beckham to create an innovative commercial highlighting the charity's new campaign.

In it, Beckham appears to speak nine different languages while talking about the disease's devastating global impact. However, the renowned British athlete is not a linguistic genius. R/GA London paired Beckham's image with the voices of nine real-life malaria survivors using deepfake technology from AI startup Synthesia.


A Positive Use of Controversial Technology

The British agency didn't use cutting-edge deep learning software on a lark. Malaria No More is launching the world's first voice petition to bring more attention to the global malaria problem. The commercial embodies the campaign's concept by having Beckham speak for the hundreds of millions of people the illness affects every year.

Undeniably, R/GA London and Synthesia did a fantastic job creating the anti-malaria PSA. The spot is concise, novel, and emotionally impactful without being overwrought. Plus, the team didn't use the technology to fool viewers into thinking Beckham actually speaks Arabic, English, French, Hindi, Kinyarwanda, Kiswahili, Mandarin, Spanish, and Yoruba.

Instead, the agency synced audio from malaria survivors of different ages and genders with the English sportsman's AI-generated mouth movements. Consequently, the commercial comes off more like an impressive magic trick than a piece of fake news.

Since its release on April 9, Malaria No More's David Beckham PSA has racked up 34,000 views. Notably, the charity's 16-month-old YouTube channel has only 185 subscribers. As such, the deepfake-powered spot clearly helped the organization carry its message to a wider audience.

The Difference Between Tools and Weapons

Even though malaria killed more than 435,000 people in 2017, deepfake critics probably won't approve of the PSA.

Since the technology's popularization two years ago, deep learning-based human image synthesis programs have been extraordinarily controversial. Admittedly, the public's first exposure to deepfake AI was not ideal: in December 2017, Motherboard reported that it had been used to make eerily convincing fake pornography featuring Hollywood stars like Gal Gadot.

Public perception of deepfake tech improved the following year, when its comedic potential came to light. In early 2018, video editors began using the technology to make hilarious parodies. One popular remix saw Nicolas Cage replace Harrison Ford in “Raiders of the Lost Ark.”

But the laughing stopped when BuzzFeed partnered with Jordan Peele to create an uncannily realistic video of former President Obama swearing up a storm. The alarming clip illustrated the technology's utility for generating fake news videos. Soon, politicians like Senator Marco Rubio were calling the AI tool a threat to national security.

People's anxiety about deepfakes continued to grow in 2019. In February, machine learning company OpenAI announced that it had developed a text-generating program so convincing that the firm declined to release it publicly, citing the potential for misuse.

It’s About Use

While the danger posed by bad actors using deep learning applications to spread misinformation or commit fraud is worrisome, something essential has been lost in the deepfake conversation. Namely, most technological advances can’t be described as good or evil, but the way humans use them can be.

To be sure, terrorists might use deepfake AI to undermine the public's trust in the media and in government officials. But before any new laws are written, it's worth remembering that the technology can also help fight a disease that infected 219 million people in 2017.
