Deepfakes: The Dawn of the Post-Truth Era

Technology
Photo Credit: DerpFakes/YouTube

Shelly Palmer

For about 200,000 years, modern humans have relied on our eyes and ears to separate truth from lies and fact from fiction. Even if we ignore the rise of fake news (and how difficult it is to do anything about it), technology (like deep learning) is on the verge of making it impossible to know if what you are seeing and hearing is real or fake.

What Is Deep Learning?

Deep learning, a subset of machine learning, is loosely based on the way engineers and scientists believe biological brains work. The “deep” in deep learning refers to the number of layers (depth) of data transformation. If you want to go deep on the subject (pardon the pun), search “credit assignment path” and pick a scholarly article that meets your needs.

Deep learning systems, also described as hierarchical learning or neural networks, learn by trial and error. Said differently, as opposed to being explicitly programmed to recognize a particular object, a deep learning system will learn how to recognize an object after a number of attempts. Sort of the way a newborn baby picks up a rattle and tries to eat it, then bangs it on the floor, then tries to break it, then bangs it on its head, then shakes it. Each time, the baby is learning what the rattle is, and what it isn’t. (You can start learning how to build deep learning tools right now by checking out Google’s AI Education site. It’s just one of thousands of sites filled with free exercises and information.)
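
If you want to see what “trial and error” looks like in code, here is a toy sketch (my illustration, not anything from Google’s course material) using the open-source TensorFlow library mentioned later in this article. The network is never told the rule; it simply guesses, measures its error, and nudges its layers a few thousand times until the guesses match the answers.

```python
# Toy sketch: a tiny neural network learning the XOR rule by trial and error.
import numpy as np
import tensorflow as tf

# Four examples of the XOR rule (output is 1 only when the inputs differ).
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype="float32")
y = np.array([[0], [1], [1], [0]], dtype="float32")

model = tf.keras.Sequential([
    tf.keras.Input(shape=(2,)),
    tf.keras.layers.Dense(8, activation="relu"),     # hidden layer (stacking more of these is the "depth" in deep learning)
    tf.keras.layers.Dense(1, activation="sigmoid"),  # output: probability that the answer is 1
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])

# Thousands of tiny guess-check-adjust cycles, like the baby and the rattle.
model.fit(X, y, epochs=2000, verbose=0)
print(model.predict(X).round())  # should print the XOR pattern: 0, 1, 1, 0
```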

Driving Innovation and Productivity – The Light Side of Deep Learning

Deep learning can be used to identify nearly anything, like distinguishing cats from dogs in images or identifying whether a photo of a kitchen features a Kohler or Moen faucet. Companies like Instagram use deep learning to detect (and prevent) bullying in photos and captions. Stanford University used deep learning to parse 2 million medical records to predict when terminally ill patients will die (which it does with 90 percent accuracy). A neural network at MIT studied more than 60 hours of musicians playing various instruments and learned to identify more than 20 instruments that can be isolated with a mouse click. And Nvidia used deep learning to “fake” slo-mo videos, synthesizing the 210 extra frames per second needed to slow a standard 30 fps video down to 240 fps (a toy sketch of the idea follows this paragraph). There are an unlimited number of good uses for deep learning systems. They have the power to truly change our world. But maybe not the way well-intentioned engineers intend …
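
To make the slo-mo math concrete: 240 fps playback needs 240 frames for every 30 real frames, so 7 brand-new frames have to be invented between each pair of originals (30 × 7 = 210 per second). Nvidia’s research does this with deep networks that estimate motion between frames; the toy sketch below fakes it with plain linear blending, purely to show what “inventing” an in-between frame means.

```python
# Toy sketch: naive frame interpolation by linear blending (not Nvidia's method).
import numpy as np

def interpolate(frame_a, frame_b, n_new=7):
    """Blend two neighboring frames into n_new in-between frames.
    Going from 30 fps to 240 fps needs 7 synthesized frames per original pair."""
    steps = np.linspace(0, 1, n_new + 2)[1:-1]  # skip 0 and 1, the two real frames
    return [(1 - t) * frame_a + t * frame_b for t in steps]

# Two fake 2x2 grayscale "frames": all black, then all white.
a = np.zeros((2, 2))
b = np.ones((2, 2))
print(len(interpolate(a, b)))  # 7 synthesized frames per pair -> 210 per second of 30 fps video
```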

Deep Fakes – The Dark Side of Deep Learning

Considering that one of the first artificial life forms created by human beings was a computer virus, it should not surprise you to learn that there are some seriously bad actors using deep learning.

Last December, someone used a deep learning algorithm to take Gal Gadot’s face and paste it onto an adult actress’s body in a porn video. The resemblance is close, although not close enough to fool anyone looking, ahem, too closely. The video came from a redditor named “deepfakes,” who used open-source machine learning tools (like Google’s TensorFlow) to create it.
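
For the technically curious, the widely reported approach behind these early face-swap tools is a pair of autoencoders that share a single encoder but keep separate decoders, one per face. The sketch below is a heavily simplified illustration of that idea in TensorFlow/Keras; the layer sizes, the 64-pixel face crops, and the training details are assumptions for illustration, not the actual deepfakes code.

```python
# Simplified sketch of the shared-encoder / two-decoder face-swap idea.
# Layer sizes and the 64x64 crop size are assumptions, not the original tool's settings.
import tensorflow as tf
from tensorflow.keras import layers, Model

IMG = 64  # assumed size of the aligned face crops

def build_encoder():
    inp = layers.Input(shape=(IMG, IMG, 3))
    x = layers.Conv2D(64, 5, strides=2, padding="same", activation="relu")(inp)
    x = layers.Conv2D(128, 5, strides=2, padding="same", activation="relu")(x)
    x = layers.Flatten()(x)
    z = layers.Dense(256, activation="relu")(x)  # shared "face" representation
    return Model(inp, z, name="shared_encoder")

def build_decoder(name):
    z = layers.Input(shape=(256,))
    x = layers.Dense(16 * 16 * 128, activation="relu")(z)
    x = layers.Reshape((16, 16, 128))(x)
    x = layers.Conv2DTranspose(64, 5, strides=2, padding="same", activation="relu")(x)
    out = layers.Conv2DTranspose(3, 5, strides=2, padding="same", activation="sigmoid")(x)
    return Model(z, out, name=name)

encoder = build_encoder()
decoder_a = build_decoder("decoder_person_a")  # learns to redraw person A's face
decoder_b = build_decoder("decoder_person_b")  # learns to redraw person B's face

# Two autoencoders that share the same encoder but use different decoders.
face_in = layers.Input(shape=(IMG, IMG, 3))
autoencoder_a = Model(face_in, decoder_a(encoder(face_in)))
autoencoder_b = Model(face_in, decoder_b(encoder(face_in)))
autoencoder_a.compile(optimizer="adam", loss="mae")
autoencoder_b.compile(optimizer="adam", loss="mae")

# Train autoencoder_a on crops of person A and autoencoder_b on crops of person B
# (training data not shown). The swap itself: encode a frame of person B, then
# decode it with person A's decoder, producing A's face in B's pose and expression.
```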

It’s far from the first time we’ve seen this sort of thing. Hollywood has used similar face-replacement techniques creatively, as when Rogue One de-aged Carrie Fisher and Captain America: Civil War de-aged Robert Downey, Jr.

There seems to be an inherent difference between de-aging an actor to play a younger version of a character they’re known for portraying and pasting a celebrity’s face onto an adult actress’s body – but is there really? To the untrained eye, both types of images look real. What does “real” mean?

To be fair, deepfakes has also done some less offensive deep learning fakery, like putting Nicolas Cage in Raiders of the Lost Ark. Last week, someone inserted Harrison Ford into the movie Solo, as though swapping the lead actor in that movie would have made it any less of an insult to Star Wars fans. (I digress.) That video came from YouTuber “derpfakes,” who has been face swapping actors into movies for most of 2018.

What’s Next?

We’re rapidly approaching a point where actors will have a new source of revenue. Sooner rather than later, instead of offering Robert Downey, Jr. $50 million to play Iron Man, the movie studio will offer him a nominal licensing fee to apply his likeness to an anonymous actor. Sound crazy? It was the plot of the film The Congress, which came out five years ago.

As technology improves (and it is improving at an exponential pace), it will become harder for us to discern what is real and what is fake. There’s a difference, both in technical difficulty and in ethical responsibility, between making a public figure say whatever you want them to say (remember Talk Obama To Me?) and swapping a celebrity’s face with an adult film actress’s face.

Deep learning tools are becoming easier to use and more readily available every day. It’s only a matter of time before this technology is as easy to apply as an Instagram filter. When everyone is one tap away from creating an alternate reality, we will be firmly living in the post-truth era.

Was the video of that murder real or a deepfake? Was the person captured on that security camera really who they appeared to be? What if someone were to deepfake you into a revenge porn video? Sadly, there are an unlimited number of ways to use deepfake technology to change the arc of a story (and to alter the truth).

Remember, deepfake technology works with both audio and video. No voice is safe. No image is safe. Incorporate AR (augmented reality) techniques into this “end of the truth” scenario, and we simply will not be able to tell reality from fiction. If you can’t believe your eyes and your ears, what can you believe?

In George Orwell’s 1984, his prescient novel about a dystopian world, the main character works at the Ministry of Truth. His job is to rewrite the daily news and then adjust historical records to support the goals of the state. When I first read the book, I thought it was either written to scare me into supporting the Cold War or just a brilliantly written, hyperbolic view of life under totalitarian or authoritarian rule. I never imagined any of it was actually possible. Rewriting history, changing the pictures and the text, swapping out the audio, and manipulating the daily news were science fiction in Orwell’s day. Today, these capabilities are just over the horizon.

What will happen when we weaponize alternate reality? Will it redefine what it means to be a superpower? Will it be used to manipulate the stock market? Wage psychological war on a global scale? In a world where we can’t tell the difference between reality and fantasy, what is truth? In the post-truth era, whom will we believe?

Author’s note: This is not a sponsored post. I am the author of this article and it expresses my own opinions. I am not, nor is my company, receiving compensation for it.
