Westworld has inspired some of the best, most engaging (mercifully apolitical) conversations I’ve had in a very long while. The series is both entertaining and thought-provoking. If you haven’t seen it, Westworld is an HBO series, based on the 1973 Michael Crichton movie, about a futuristic amusement park where people can interact with synthetic humans (robots that are indistinguishable from human beings) and do pretty much whatever they want with them.
Is Sex with a Robot Cheating? Yes or No?
There are quite a few articles that explore the ethical and moral issues arising from the advent of this type of technology. This article is not one of them. This article is about the incremental technological steps between where we are now and where this is clearly going.
So if you want a yes or no answer to the question, “Is sex with a robot cheating?” you have to reword the question thusly: “Is sex with a surrogate, synthetic human with a convincing amount of AI capability cheating?” Asked that way, almost everyone I’ve discussed this with would quickly say, “Yes.” Which raises the question, “When is this going to become an actual problem, as opposed to a hypothetical one?”
Answering “when” is complicated by the verb “to be.” In the series, we (the audience), the Guests (the humans who visit Westworld) and the Hosts (the synthetic humans) do not know if the Hosts are sentient. If they are sentient, then we have created artificial life. That technology will come with its own set of remarkable problems. (We’ll explore artificial life in a different article.)
That aside, if they are not sentient, then they are just mechanical devices. But this line of reasoning has gotten me into a bit of trouble.
Incremental Steps to Human-on-Robot Relationships
“Real” Westworld-style Hosts are the stuff of science fiction. (We’ll discuss what it might take to build one in a minute.) The point is, you don’t need anything like a Westworld-style Host to create a real world of ethical, moral and sexual issues.
In 2013, Joaquin Phoenix starred in Her, a movie about a man who falls in love with a natural language processing (NLP) system named Samantha. According to the Consumer Technology Association, in 2013 NLP systems could understand approximately 3 out of 4 spoken words. Today it’s just shy of 4 out of 4. That is an amazing amount of technological ground to cover in only three years.
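To make the “3 out of 4” figure concrete: speech systems are commonly scored by word error rate (WER) – the word-level edit distance between what was said and what the system transcribed, divided by the number of words actually spoken. A 2013-era system missing roughly one word in four corresponds to a WER around 25 percent. Here’s a minimal sketch of the metric (the sample sentences are invented for illustration):

```python
def word_error_rate(reference, hypothesis):
    """WER: word-level edit distance divided by the reference word count."""
    ref, hyp = reference.split(), hypothesis.split()
    # Classic dynamic-programming edit distance, computed over words
    d = [[0] * (len(hyp) + 1) for _ in range(len(ref) + 1)]
    for i in range(len(ref) + 1):
        d[i][0] = i
    for j in range(len(hyp) + 1):
        d[0][j] = j
    for i in range(1, len(ref) + 1):
        for j in range(1, len(hyp) + 1):
            cost = 0 if ref[i - 1] == hyp[j - 1] else 1
            d[i][j] = min(d[i - 1][j] + 1,          # deletion
                          d[i][j - 1] + 1,          # insertion
                          d[i - 1][j - 1] + cost)   # substitution
    return d[len(ref)][len(hyp)] / len(ref)

# Two misheard words out of eight -> 0.25, i.e. "3 out of 4" territory
ref = "please play the next song on my list"
hyp = "please play the next long on my fist"
print(word_error_rate(ref, hyp))  # 0.25
```

Going from roughly 25 percent WER to the low single digits is what “just shy of 4 out of 4” means in practice.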
Why do we care about NLP? Well, at the current pace of technological change, the fictional NLP system that Joaquin Phoenix’s character fell in love with is just a few years from becoming science fact. We’re just waiting for AI to improve a bit – not all the way to pseudo-consciousness (however that may be defined), but just good enough to fool you. (Do some research on the Turing test if you’re truly interested in what I mean by “fool.”)
Can you fall in love with a computer? Can you have phone sex with it? La-la-la-la, I can’t hear you … don’t answer; they are rhetorical questions. But I have to ask, is a relationship with a non-sentient NLP system cheating?
When I asked people to qualify what “falling in love” means, most made the same point: “If you need to fill an emotional void in your life, it doesn’t matter if you’re doing it with a human, a machine, a hobby or your work. If your partner cannot fulfill your emotional needs and you don’t disclose this reality to them, you’re cheating.”
You may or may not agree with that consensus, but this is only armchair research. I’m sure there will be many professional ethicists and scientific researchers exploring the full range of human emotions involved.
The Next Level
The adult entertainment industry is evolving Virtual Reality (VR) much more quickly than it evolved streaming video or webcams. There are several companies working as quickly as possible to combine VR, NLP and AI to create convincing sexual experiences. Following this logical progression, would a relationship with a non-sentient system that combines VR, NLP and AI be considered cheating?
Some Guidelines Are Required
Are you having trouble deciding where the line is? Do you have a private line as well as a line that you would prefer to present to the outside world? If we’re talking about a device that is not sentient or conscious – a machine – is there even a line? I’ll let others tell you how to think. I’m just going to tell you about the technologies that are here today or just over the horizon.
From a technologist’s perspective, I can say with certainty that we are very close to creating systems that will cause us to rethink and redefine some of our most sacred ideas. Terms like relationship, commitment, fidelity, monogamy and many, many more are about to evolve. In practice, we don’t need anything like a real Westworld-style Host to cause all kinds of pain, anxiety or worse.
Remember that talk you had with your parents when you were just starting to notice the opposite sex? It’s time to think about how that talk is going to sound when you add loving, committed relationships with an AI system to the mix.
Unless the machine identifies itself as a machine, a child who is five to eight years old today will not be able to tell whether they are speaking to a person or a machine by the time they are seriously dating. At the rate the technology is progressing, it will take less than 10 years to get there – probably way less.
Want to think about some potentially mind-bending consequences of the advance of machine learning and NLP systems? I wrote an article in May 2016 titled “I’ve Talked to the Future and It Talked Back.” Here’s a question you should be asking yourself today: “Should you treat your household NLP interface (Alexa, Google Personal Assistant, etc.) like a person? Should you be polite when you speak to it, or is it OK to be abrupt or even abusive? The devices won’t care. They don’t have feelings; they are computers that have been programmed to speak with us. Will we care? How will we teach our children to differentiate between machines that sound and act like people, and other disembodied voices that are actual people? Will a child (or a grown-up) know when a customer service representative is a person or an NLP interface? In the very near future, it will be almost impossible to tell.”
That’s Great, but When Will We Get Westworld-Style Hosts?
There have been countless hours of chatter between my professional friends and colleagues who work in and around the technologies we believe would be required to create real Westworld-style Hosts. I say “believe” because if we were to start building one today, we could only use the component technologies we have now: current chipsets, power sources, machine learning, data science, robotics, nanotechnology, regenerative reliquary (bio-printed scaffolding seeded with stem cells or something like stem cells), some kind of chemistry to blend collagen with synthetic fiber or a fully synthetic substitute for collagen, and on and on.
But that’s today. If anyone were to start this journey with any hope of making a device that was not only indistinguishable from a human but could also fool another human into believing it was alive, they’d probably be better off waiting for a more advanced set of building blocks.
Assuming there is a reason to actually build Hosts, they will probably be born in the last half of this century. Could it be sooner? Of course. On the other hand, get ready to discuss relationships with machines realistically, because the subject will not be hypothetical much longer.
About Shelly Palmer
Named one of LinkedIn’s Top 10 Voices in Technology, Shelly Palmer is CEO of The Palmer Group, a strategic advisory, technology solutions and business development practice focused at the nexus of media and marketing with a special emphasis on augmented intelligence and data-driven decision-making. He is Fox 5 New York’s on-air tech and digital media expert, writes a weekly column for AdAge, and is a regular commentator on CNBC and CNN. Follow @shellypalmer or visit <shellypalmer.com> or subscribe to our daily email <http://ow.ly/WsHcb>.