Violence and empathy in “Westworld”: “That’s not the point we want to make, that it’s OK if they’re a robot”

Only two episodes of HBO’s “Westworld” have aired, and already it has captured audience fascination and become one of the season’s most talked-about shows. While it has yet to earn an official renewal, the premiere attracted a healthy 3.3 million viewers in its first two airings and via streaming, with 2.7 million tuning in for the second episode.

That’s promising news for series creators Jonathan Nolan, who created and executive produced CBS’ “Person of Interest,” and Lisa Joy, who has previously served as a producer for “Burn Notice” and “Pushing Daisies.”

The husband-and-wife team has only begun to explore a number of heady themes within the confines of the show’s Western-styled park. But the topmost one is existential, as the series asks the audience to consider what exactly it means to be human. For his part, Nolan spent five seasons of “Person of Interest” exploring the possibilities and dangers inherent in artificial intelligence, and his fascination with the subject continues with “Westworld.”

Inspired by Michael Crichton’s 1973 cult sci-fi film of the same name, the show centers on Westworld, a theme park that caters to rich men and women unfettered by the rules and etiquette imposed upon them in their normal lives. It’s a fantasyland populated by realistic androids, known as hosts. The hosts are actors and servants, existing to lead attendees on quests or satisfy their basest urges. Guests are even free to harm or murder the hosts, with the knowledge that staff will wipe their memories, patch them up and reintroduce them into the park.

Nolan and Joy play with that aspect of “Westworld” to invite viewers to ponder the emergence of artificial intelligence — how we interact with it, what role it plays in our lives and its impact on our morality. And what better way to do this than in a live-action role-playing experience?

For gamers, “Westworld” also inspires a new consideration of their relationship with nonplayer characters. As they researched the script, the couple played a number of video games, including “Grand Theft Auto.”

“She obeyed all of the stoplights!” Nolan said, laughing.

“I’m a terrible gamer,” Joy admitted. “I didn’t want to hit anyone. I [respected] the streets, but I’m terrible. At the same time, I’m writing this show.”

Recently Salon sat down with Joy and Nolan at an HBO press event in Los Angeles to talk about what their sci-fi Western says about humanity, its portrayal of violence and the tragic plight of the nonplayer character. (This interview has been lightly edited for length and clarity.)

What made you want to expand the world of the film into a full-fledged TV series?

Jonathan Nolan: There are two aspects of it. One, this idea of simulated realities. We were working on the show for a couple of years, and during that time [there were] huge strides in both of the topics that the show really deals with: one, simulated realities. Obviously in the show we present a reality that’s not a simulation . . . but the idea of curated play space, gaming becoming not just a medium but a lifestyle and something that is a much more significant phenomenon.

Then, artificial intelligence. The two are inextricably linked if you’re building a game these days, a role-playing game. Whether it’s online or on a console, you’re dealing with artificial intelligence, you’re dealing with game play.

We were really taken with the idea of telling a story from the perspective of an emergent AI, a creature who didn’t really understand that the joke was on them or that the parameters of their world were not exactly what they thought they would be.

Lisa Joy: For me, it was the chance to write a frontier story on two levels. There’s the Western frontier, the kind of classic notion of the genre, people against the kind of lawless land that’s still being tamed a little bit. That genre has appealed to so many people for so long because it’s really about man versus the abyss.

And against all of that nothingness, against the lack of law and order, you really see who you are. Within it, the hosts are dealing with that search for identity and what their own moral code is, just as the human guests who visit and just as the scientists who work on them are trying to find the law of what their responsibility is as a scientist.

The other frontier aspect of this, of course, is the science aspect. . . . It’s a kind of new frontier of science and it happens to be the frontier we are at this very day in our own lives with the emergence of AI getting more and more sophisticated.

Westerns are a difficult genre to sell to audiences on a large scale these days.

Joy: Yeah. This was, for me, an opportunity because I wasn’t the target audience for a Western. But I saw in this synthetic Western an idea that just immediately excited me.

I thought of the genre, and I thought of all the possibilities of all those people who are just footnotes in all those Western stories. The women who were either just the damsels or the whores, the sidekicks who never got their own real story. What about those stories? To enliven those stories and those points of view and then to put them in a world that’s a Western — but not really because it’s a synthetic Western. And you get reminders of that constantly.

The most surprising aspect to me was this whole idea of evolving not just AI but, as you’re talking about role-playing games, these increasingly sentient nonplayer characters. That makes watching “Westworld” a really queasy experience at some points.

Nolan: I saw a cartoon online a few years ago, and it was the world from the perspective of a nonplayer character. It was a loose riff on “Skyrim.” It was a fantasy world where two NPCs are standing at the food vendor or whatever, and they see Player 1 walk past. Everyone says, “It’s Player 1! It’s Player 1!” One of them says, “I don’t know. I have this slightly weird feeling about him.” You see this little quick save window pop up, and then Player 1 lays waste to all the NPCs.

Then they’re reset and put back in, and the NPC is still saying, “I don’t know. I have this weird feeling.”

As Lisa said, we want to jump into the NPCs, who are kind of relegated to supporting role status, and see what would happen if they decided to make themselves the stars of the show. What would happen if they realized their own limitations and started being able to alter them? That’s the long-haul story of the series.

What measures were taken in the writers’ room to ensure the story maintained a human element, both in terms of the actual newcomers coming in and in terms of the personalities given to the robots, the hosts?

Joy: It’s something that you deal with in fiction and games and also in life, the ability to de-personify and abstractify the other. . . . It’s just the way humans work. We are egocentric by design. Our empathy is not at the same level in everyone. Neither is kindness. That’s something we’re kind of exploring within this series — how human nature works, and where you form that connection.

For instance, with Dolores [a host played by Evan Rachel Wood] and her plight, I really empathize with Dolores personally. That doesn’t really matter for the show. It will be received by different people in different ways. They will feel different connections with different characters. For whatever it’s worth, I felt a real connection with her character.

Sometimes when I talk to other people, especially as we were trying to gauge how human to make her — and when we let those little signs of inhumanity show . . . then it kind of wouldn’t matter to those people what happened to Dolores. It was an interesting kind of Rorschach [test] to people as we talked about it and responded to it. We realized that we had to be very, very careful and very, very modulated in how we told these stories and how we portrayed these characters in order to give them their personhood.

Regarding Dolores’s assault, that whole idea of it happening over and over again is sickening. But there’s also been a lot of conversation about it in the context of what’s going on with HBO in general. One of the things said in response to that point [by HBO’s programming chief, Casey Bloys,] was “Well, but Dolores is a robot.”

Nolan: That’s not the point we want to make, that it’s OK if they’re a robot.

Yes, but …

Nolan: It’s not OK, but that is one of the questions that the show is asking. In the world of our show, they have created a space in which people can go and behave however they want with no consequences. That, of course, is exactly “Grand Theft Auto.” It’s more sophisticated. The NPCs’ AI is more sophisticated. We’re asking this question . . . at what point does this become problematic? At what point does this become abhorrent? You don’t feel any guilt.

It’s really an examination of empathy because there’s a difference between playing those games and simply watching violence onscreen.

Joy: Yeah. There’s a difference between writing it and seeing it on the screen later, too. When it’s in your head, you know it’s fake. Even seeing it in production, you’re there and everybody’s doing it, and it’s different. Then you see it on the screen, and . . . it’s hard.