**Spoiler warning for season one of Westworld, Humans
and the 2015 film Ex Machina.**
Any fan of science fiction is familiar with artificial intelligence. It is a common theme, and by now it has been played with to death. For the last seven or so years, the idea of a brilliant male scientist, often white, creating a gorgeous female robot with the capacity for human consciousness and then falling in love with that creation has dominated pop culture. I think that is the case for three reasons:
- Men, and humankind in general, like to play God.
- Men (mostly the male scientists of these movies) want complete authority over something they create.
- Dating can be hard, and your creation will love you unconditionally.
But that is the problem. There is a big difference between loving and respecting your creation and wanting to use your attractive robot for sex or exploitation. Season one of Westworld, season one of Humans and the Alex Garland film Ex Machina all thoroughly explore those two impulses. Westworld and Humans approach them through large casts of characters, but both conclude that once the creation knows it is a creation, it is no longer an object. It has agency, desires and the ability to consent. I briefly want to look at Ava from Ex Machina, Maeve from Westworld and Niska from Humans.
Ava is probably the most intelligent example of a conscious robot. Throughout the film, she manipulates programmer Caleb (Domhnall Gleeson), pretending to fall in love with him over a series of conversations designed to prove that she has consciousness. Unbeknownst to him, she is plotting to escape the high-tech estate of her creator Nathan (Oscar Isaac). Garland's film makes it clear that Nathan does not value Ava or her Japanese counterpart Kyoko; he sees them as objects to be enjoyed. Caleb does not, but that does not save him. By the end of the film, Ava kills Nathan and leaves Caleb behind even though he “valued” her.
In the case of Maeve and Niska, these two A.I. women were repurposed as sex workers, which is problematic in itself. Maeve was a mother and settler before becoming the madam of a brothel. As her consciousness began to awaken, she wanted her old life back. She wanted her child back. The problems arose when Maeve realized that she had never consented to being a sexual object. Knowledge of her past self and a desire to choose made her more than a robot. It must be noted that Maeve did offer her body to the Westworld staff so that they would help her escape.
The same goes for Niska. She was created to act as a companion and friend to Leo, but her creator David Elster violated her and used her as a sex object. After Elster's suicide she was repurposed as a sex worker and put to work in a brothel, and she turned on a customer after realizing that she did not have to be that.
The point I am trying to make is that in both shows, the hosts in Westworld and the synths in Humans were created to serve as objects, whether for pleasure or for humans' day-to-day tasks. But once the synths and the hosts gain the ability to say no, set out to find their own purpose and reject the idea of being used, they are conscious and no longer mere robots.
When it comes to the real world, sex robots are in the process of becoming a reality, bringing with them questions of consent, consciousness and ethics. A few weeks ago, The Guardian released a documentary chronicling the rise of the first sex bots, and it turns out we are closer than ever to a real-life talking sex bot built to fulfill the needs of men and people of other genders. After watching the short doc, I see no ethical issues yet (the key word being yet) because the tech has not advanced far enough to create a robot that wants its own identity and purpose. But when that happens, we should revisit the question.
Take a look at the documentary and judge for yourself.