In the premiere episode of HBO’s new series Westworld, viewers are (re)introduced to the concept of a western-themed amusement park where you can fulfill all your Wild West fantasies, be it joining an orgy with saloon girls, going on a demented shooting spree, or helping the town sheriff capture a known bandit. Your adventure can be as mucky as Deadwood or as wholesome as Bonanza. As bloody as The Wild Bunch or as bloodless as My Darling Clementine.
All of this is possible thanks to lifelike robots called “Hosts,” which are essentially Disneyland animatronics that behave like video game NPCs. They follow a predetermined script based on their roles, unless prompted by park guests to branch into one of a hundred other scripted possibilities. (Showrunners Jonathan Nolan and Lisa Joy have described it as a cross between the open-world video game Red Dead Redemption and the interactive play Sleep No More.)
The show is a reboot of Michael Crichton’s techno-paranoid 1973 film of the same name, but it takes the opposite point of view. The central character of the HBO show is not a park guest but a Host named Dolores, played with impeccable calibration by Evan Rachel Wood, whose role is that of a farmer’s daughter repeatedly crying over the many heroic deaths of her chivalrous beau Teddy, played by the affable but Ken-doll-like James Marsden.
When the series begins, decades into Westworld’s operation, some of the Hosts have just received a software update that includes something called the Reverie code, which allows them to observe small human gestures (like running a finger across your lip), remember them, and incorporate them into their behavioral programming, even after their memories get wiped at the end of their story cycle. We see the unintended consequence of this code in those Hosts, particularly Dolores and Teddy, when they begin to remember things they shouldn’t.
Long ago, computer science pioneer Alan Turing devised a test wherein a conversation between a human and a computer is examined by a blind third-party evaluator, who then tries to deduce which side of the dialogue is the computer. This “Turing Test” gets name-checked nowadays in every sci-fi story about artificial intelligence, as a way to establish how sophisticated the story’s machine is at mimicking humans. The Hosts in Westworld, however, while possessing a form of intelligence, are not meant to pass this test. As glorified animatronic theme park attendants, they stick to roleplay and are programmed never to break the fourth wall with park guests. They are never meant to be confused with a real person.
There was an intriguing submission about the Turing Test a month ago in /r/ShowerThoughts, a subreddit meant for sharing random philosophical musings that occur to you while you’re doing mundane things. User Grandure posted, “I’m not scared of a computer passing the turing test… I’m terrified of one that intentionally fails it.” Buried under other replies to the thread is a comment by Jonathan Nolan that, amusingly, went completely unnoticed by Reddit users until after the premiere. Understandably so, given that it’s a disposable comment that means nothing unless you knew he was the guy writing a Westworld reboot. Nolan posted, simply, “Boy have we got a show for you!” On the surface, it’s just a cheeky reference to the “Boy have we got a vacation for you!” catchphrase from the original movie, but after seeing the show, it appears to echo the chilling exchange at the end of the premiere.
Stubbs, the park’s head of security, played by Luke Hemsworth, interrogates Dolores in her backstage diagnostics mode following an unusual encounter with a few malfunctioning Hosts. Stubbs asks suspiciously whether Dolores has ever lied to him. She says no. The question is a version of the Liar’s Paradox in philosophy (a liar would say no anyway, but saying yes confirms that you are one). It’s used often in robot stories, most famously in an episode of the original Star Trek in which the Enterprise is hijacked by a group of androids and Kirk uses the paradox to cause a logic failure that shuts them down.
If we view Stubbs’s question as a form of Turing Test, Dolores saying no is the only possible answer that marks her as a script-reciting automaton, since a human mind (or the imitation of one) would likely recognize the paradoxical nature of the question. What tells us where she truly falls in this philosophical conjecture is when Stubbs asks Dolores if she would ever kill a living thing. “No, of course not,” is her answer, shortly before we see her kill a fly. A lie.
Flies are a recurring motif throughout the premiere, used as a marker of the progressive deterioration of the Hosts’ programming. In Westworld, all the animals are robots, too, so the only living things other than the park guests are the bugs. Early on, we see Teddy and Dolores unbothered by flies walking across their faces. But later in the episode, the Sheriff breaks down mid-sentence when a fly suddenly lands on his cheek.
The reason? Shortly before the Sheriff fritzes, the camera lingers on an annoyed park guest swatting flies near her face. That’s a common human behavior that the aforementioned Reverie code would presumably prompt Hosts to imitate, but the head programmer, played by Jeffrey Wright, insists while diagnosing the Sheriff that Hosts, in true Asimovian fashion, “literally cannot hurt a fly.”
Whereas the two conflicting directives caused the Sheriff to crash, much like the androids in Star Trek, for Dolores the Reverie code has somehow managed to supersede the First Law of Robotics.
So the frightening takeaway from all this is that, if Dolores is capable of passing a Turing Test, she has also decided to fail it on purpose in order to conceal her own autonomy. Nolan’s right: Westworld is exactly the show for that Reddit user.
This is what’s exciting about the show’s potential direction. At this point, the Hosts turning on the guests like they did in the movie is the least interesting route the show could take. The morality of the original film is a rather straightforward story of the hubris of man, one Crichton argued better when he reused the concept two decades later in Jurassic Park. It’s a lot more fun to play with the show’s idea of robots becoming a problem not so much because they’re becoming independent, but because they’re getting better at simulating human behavior. There was a real-world example earlier this year, when Microsoft’s self-learning artificial intelligence Tay morphed into a trans-bashing anti-feminist within 24 hours of coming into contact with Twitter users.
In many AI stories, a recurring question concerns the ethics of humans harming machines, but the effect of that act on the human characters is often a secondary concern (if one at all) to the machine’s personhood and freedom. The juicy thing about Westworld’s theme park setting is that the ethics are intrinsic to the premise. The park’s resident scriptwriter says as much at one point in the episode: Westworld can only be enjoyable to guests if the uncanny valley is never crossed (“Do you want to think that your husband is really fucking that beautiful girl, or that you really just shot someone?”). Playing into the meta nature of the show, the same challenge is posed to the viewers.
Rape as a plot device is so overused for female protagonists that it’s become a cliche, and Westworld received some deserved pushback from critics at the Television Critics Association screening a few months ago for immediately going in that direction with its main character. That said, while it’s definitely still crudely trafficking in that tired trope for dramatic purposes, the interesting thing about its use in this particular instance is how it’s presented.
The scene occurs not ten minutes into the episode, shortly after an early twist that establishes the show’s level of reality (“I didn’t pay all this money because I want it easy.”), but before viewers have even received the full exposition of the show’s premise or been introduced to the idea that Dolores is a robot with her own agency. The show essentially asks viewers to make a snap decision right then and there: is what they’re seeing rape, or mechanically assisted masturbation? Did Evan Rachel Wood play the scene with convincing enough horror for viewers to buy that Dolores really felt she was being assaulted, and is thus deserving of personhood?
It’s a demented and unsavory form of a Turing Test, and you’re the evaluator.
Disclaimer: This article was entirely written by an algorithm.