Late last year, I wrote about deepfakes and how, for about 200,000 years, we have relied on our eyes and ears to separate truth from lies and fact from fiction. Even setting aside the rise of fake news, technology is on the verge of making it impossible to know whether what we are seeing and hearing is real or fake.
It’s been less than a year since I wrote that article, and the technology has advanced in ways that make last year’s fakes look primitive. Here’s a roundup of the past year’s advances and uses.
Already Illegal: DeepNudes
Revenge porn has been outlawed in many jurisdictions around the globe. In July, Virginia became one of the first states to outlaw the sharing of computer-generated pornography, a.k.a. deepfakes. It did so by amending an existing law that criminalizes revenge porn, making clear that the category now includes “falsely-created” material.
Scary: DNC Deepfake
The Democratic Party deepfaked its own chairman to highlight 2020 concerns. In early August at DEF CON – one of the world’s biggest conventions for hackers – attendees were told DNC Chair Tom Perez couldn’t make it for the DNC’s presentation. Instead, he “Skyped in” and chatted. Except he didn’t. Do you know what Perez’s voice sounds like? Neither did those in attendance.
Bob Lord, the DNC’s chief security officer, led the exercise (with the help of experts in the field of AI) as a way to demonstrate the art-of-the-possible. This is what’s coming leading up to the 2020 election. The video speaks for itself.
Fun: “Fixing” The Lion King
It’s important to remember that not all deepfakes are malicious. After Disney purists didn’t like the (stunning) computer graphics in this summer’s release of The Lion King, they “fixed” the trailer by melding the original (animated) version’s character faces onto the new version’s animals. The resulting footage is so well done that it takes a second to realize what you’re looking at.
Somewhere In Between: FaceApp
In case you’ve been living under a rock this summer, or simply took a vacation from the Internet, FaceApp is an app on Android and iOS that uses AI to “generate highly realistic transformations of faces in photographs.” Basically, FaceApp puts the power of deepfakes in your hands, making it a snap to make a person in a photo look older, look younger, or change genders.
Aside from the mild controversy – Fast Company discovered that every photo uploaded to the app was sent to FaceApp’s cloud servers (which you agreed to when you accepted the app’s terms and conditions but clearly did not read) – the tech here is remarkable. FaceApp has been offering its photo effects since early 2017, but the results have gotten much better in the years since it launched.
While FaceApp has limited features, it’s one of the easiest and most popular deepfake creators available to the masses. Its speed and impressive results make it fun … and also dangerous.
What Does This All Mean?
The technology behind deepfakes was already easy to find and use last fall, but it’s even more so today. Imagine where it’ll be next summer when the 2020 election is four months away.
Did Elizabeth Warren really say that? Was that footage of Bernie Sanders in that attack ad real, or was it a deepfake? As political advertisements already twist candidates’ words and manipulate the truth for the perfect soundbite, can you believe anything you hear when it can all be manufactured on any laptop you can find at Best Buy?
It Gets Worse
While news cycle after news cycle covered allegations of possible social media manipulation prior to the 2016 election, is anyone ready for what comes next? Let’s quickly imagine how an AI-assisted system will exacerbate the problem.
Go to ThisPersonDoesNotExist.com. See that face? That lifelike photo? That’s not a real person. The site “showcases fully automated human image synthesis by endlessly generating images that look like facial portraits of human faces.”
Fun! Right? Only until you think about the possible ramifications. How much work would it take to build an entire history and lifetime for this person? You could use AI to generate the person’s name and backstory (maybe seed the system with an obituary so it could find the person’s living relatives on social media, alongside a cache of “real” pictures).
Create an email address, Facebook account, and Twitter profile. Use FaceApp to age the photo up 30 years, 60 years. Photoshop the character’s likeness into his or her (real) family photos. Do the same in other photos.
Use AI-assisted software to generate tweets and Facebook posts. Use procedural generation to make the character’s posts seem organic rather than scheduled. Have the character retweet things. Reply to things. Like things. Interact with the world. It’s like the “Yahoo Boys” military scam, but automated and amplified to the nth degree by machine learning.
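To see how little the “procedural, not scheduled” trick requires, here is a minimal Python sketch. The function name `humanize_schedule` and the 45-minute jitter window are my own hypothetical choices, not anything from a real bot framework: each planned posting time simply gets nudged by a random offset so the timeline loses its machine-made regularity.

```python
import random

random.seed(42)  # fixed seed so the example is reproducible

def humanize_schedule(base_times, max_jitter_minutes=45):
    """Offset each planned posting time (minutes since midnight)
    by a random amount so the pattern doesn't look machine-scheduled."""
    jittered = []
    for t in base_times:
        offset = random.uniform(-max_jitter_minutes, max_jitter_minutes)
        # clamp the result to the same day
        jittered.append(max(0, min(24 * 60 - 1, t + offset)))
    return jittered

# Three posts nominally at 9:00, 13:00, and 20:00
plan = [9 * 60, 13 * 60, 20 * 60]
print(humanize_schedule(plan))
```

That’s the whole idea: a few lines of randomness are enough to defeat “this account posts at exactly 9:00 every day” heuristics.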
Build a generative adversarial network (GAN) to create retinal scans and fingerprints (the same way fake faces are generated on thispersondoesnotexist.com). Snag some Social Security numbers off the dark web, and … Poof! We’ve created a virtual person who is practically indistinguishable from a real one. It would take quite a bit of research to figure out that this fabrication was not a living, breathing person.
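For the curious, the adversarial idea behind a GAN fits in a few lines. This is a deliberately tiny sketch in pure Python, with toy stand-ins of my own invention: a two-parameter “generator” learns to mimic a one-dimensional “real” distribution (standing in for faces, retinal scans, or fingerprints) by playing against a logistic “discriminator.” Real systems like the one behind thispersondoesnotexist.com use deep networks, but the training loop has this same two-player shape.

```python
import math
import random

random.seed(0)

def sigmoid(u):
    return 1.0 / (1.0 + math.exp(-u))

# Discriminator D(x) = sigmoid(w*x + c): tries to tell real from fake.
w, c = 0.1, 0.0
# Generator G(z) = a*z + b, z ~ N(0, 1): tries to fool D.
a, b = 1.0, 0.0

REAL_MEAN = 3.0   # "real" data comes from N(3, 1)
lr = 0.02

for step in range(5000):
    x_real = random.gauss(REAL_MEAN, 1.0)
    z = random.gauss(0.0, 1.0)
    x_fake = a * z + b

    # Discriminator: gradient ascent on log D(real) + log(1 - D(fake))
    s_real = sigmoid(w * x_real + c)
    s_fake = sigmoid(w * x_fake + c)
    w += lr * ((1 - s_real) * x_real - s_fake * x_fake)
    c += lr * ((1 - s_real) - s_fake)

    # Generator: gradient ascent on log D(fake) (non-saturating loss)
    s_fake = sigmoid(w * x_fake + c)
    grad_x = (1 - s_fake) * w   # d log D(x_fake) / d x_fake
    a += lr * grad_x * z        # d x_fake / d a = z
    b += lr * grad_x            # d x_fake / d b = 1

# After training, the generator's output is centered near the real mean.
print(f"generator offset b = {b:.2f} (real mean is {REAL_MEAN})")
```

The generator never sees the real data directly; it only sees the discriminator’s opinion, and that pressure alone drags its output toward the real distribution. Swap the scalars for convolutional networks and the same loop produces photorealistic faces.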
Use headless Chrome to create a few thousand accounts per day (I’m not going to tell you how to mask your network connection, but you could do that easily).
Bias the AI “red” or “blue.” Or, if you choose, “anarchist” or “union organizer,” or make your created person appear sympathetic to whatever cause serves your purpose.
Train the AI to follow “real” accounts (or fake ones that you’ve created). The AI-generated tweets and retweets or Facebook posts and shares are going to look (and be) real. They just won’t be from humans.
Activate your influencer network of a half-million accounts. Then wait for November 3, 2020, and see how you did.
You may not like this game, but it is not going to be played by you or me. It’s going to be played by nation-states with unlimited resources. Buckle your seat belt. This is going to be a rough ride.
Author’s note: This is not a sponsored post. I am the author of this article and it expresses my own opinions. I am not, nor is my company, receiving compensation for it.