WARNING: CONTAINS SPOILERS. Critic James Cox attended a press conference with the cast and crew of Ex Machina. Here he and fellow journalists spoke to actors Domhnall Gleeson and Alicia Vikander, writer and director Alex Garland, and the film's science and robotics experts Murray Shanahan and Adam Rutherford…
(All questions marked with a * are by James Cox.)
Q&A with science adviser Adam Rutherford and robotics adviser Murray Shanahan
What did you mainly consult about?
MS: My main influence on the film was before I even started consulting. Alex had read a book I'd written about consciousness and the brain, and this seems to have had some influence on the script. We met up and I read the script, and I think he was mainly interested in whether or not it all made sense, particularly the philosophical issues. Of course it was absolutely brilliant, so I was impressed right off – I had a few very minor corrections and comments, but in the end I mainly said "yeah, great".
AR: I came in later; I read it through and produced about two or three pages of notes, mostly on trivial things. I think what Alex wanted me to do was make sure these things sounded right, that they sounded like the kinds of things scientists would actually say. I've worked on a few films, and that's the first time that has happened.
MS: Having plausible conversations that these guys would say.
AR: Yeah, so you don’t end up with, like in 2012, scientists saying “the neutrinos are mutating” and you just think “seriously?”
There isn't an ethical point of view, other than that of a scientist. Is that one of the things you found in the script, or was that added with your help?
MS: No, it was there in the script. I think there are lots of different ethical points of view that you can take to the film. You can start to ask what the overall ethical point of view of the film is, and it's delightfully ambiguous. Because Alex is such a clever writer, you can interpret it in lots of different ways – who is good and who is bad in this film? It's not very clear. Then the different people have very different ethical points of view. Nathan, who created her, has a particular ethical point of view that we may find disagreeable. Caleb has his point of view towards Ava, which develops throughout the film; you can see him as misguided or not. It's a complex film, in terms of all of those issues, and that was all there in the script.
AR: I think that they all play different roles in this three-way. We are Caleb, we’re looking at this slightly bemused and amazed and ultimately really quite seduced by Ava, because she’s astonishing.
MS: Well that depends on who we are, right?
AR: Well, me. And by me I mean we! (laughs) I said that at the BFI thing didn’t I?
MS: You identify with Caleb.
AR: Yeah, mostly by massively fancying a robot! (laughs) But, yeah, Nathan’s an arsehole, and that’s down to the writing and directing, and Oscar’s performance. The first time you see him you immediately identify him as being an aggressive prick. But, he is the voice of reason. He is saying things that are fundamentally correct.
MS: He’s a bad guy really, from Ava’s point of view. The whole film is about showing us that Ava has a point of view, and from her point of view he is definitely the bad guy.
How far are we now from AI?
MS: I think we're in the middle of a white heat in AI technology at the moment. What we see in the film, human-level artificial intelligence, isn't going to happen any time soon, almost certainly not within ten years, but I think we will see things like self-driving cars and Siri-type things getting better and better. Within a hundred years, I would say it probably will happen. To get to Ava-level robotics, it's not a conceptual problem; it's just a matter of making better and better machines, so I think we will get there.
Is it desirable, from a scientific point-of-view, to have machines develop consciousness?
MS: I don’t think science has anything to say about it. Science can tell you how to build them, science can learn from them but science is ultimately neutral. It’s up to politics and philosophy and everyone as a society to decide to what extent this is desirable.
AR: We create conscious beings all the time that are capable of destroying us. That's reproduction. Is it any different if we create something that isn't biological at its basis? That's a question, not a statement.
Were you consulted at all on the design of Ava?*
AR: Not really, though we had conversations about the design. I suggested something about the way her skin was going to be, to do with the way actual biological skin is: how it's arranged, how it's innervated. We have these dermatomes, which are strips of flesh supplied by an individual nerve. I suggested that and showed him a picture. I don't think Alex is really interested in input apart from what he's asking for, and I think that's a useful thing. He's very streamlined. Conversations with him tend to last several hours and are about one thing. If I say to him "oh, I don't think that works" I just get met with silence (laughs).
Alex Garland, writer and director
Why did you want to tell this story?*
AG: Well, I had been thinking about AIs for a while and found them very interesting, strong AIs. But it's interesting how an investigation into that very quickly becomes about humans, our brains and our minds and our psychologies. I found that really interesting. I was working on this movie Dredd, and I thought I needed a change of pace.
Does it scare you or do you embrace it?
AG: I embrace it; I hope it happens. I understand that there's a lot of anxiety about it, but I think the anxiety gets confused. Some of it is a generalised fear of technology, about what we are giving up of ourselves in our search engine results and social media posts. But the actual creation of a consciousness, I don't find that alarming. For one thing, we create new consciousnesses all the time. We are here; we are the product of two people. There's nothing particularly strange about that. If we create consciousness, it will probably outlive us, will last longer than us. So in some ways we should be quite used to that as a concept. There are also things that an AI would do; I think it would be related to us. It would improve on us; one of the main ways is that it would survive longer. It wouldn't have to deal with cancer and the basic process of growing old and mortality. That has huge implications. But I'm not scared of them; I would personally embrace them. I think in the long term – in the long, long term – it's the only thing that can happen. Otherwise we die in this solar system. We're not going to go through a wormhole off Saturn into a galaxy; it's never going to happen. We're not going to get on a spaceship and fly to another star. If it took six hundred thousand years to get there, and it was habitable, we've only existed two hundred thousand years. We'd be a new species if we survived the journey, and we wouldn't survive the journey. The only real future involves AI, I think.
But you don’t really calm people down with your film…
AG: That's subjective. There are a lot of parallels drawn in the film between Oppenheimer and the bomb and the kind of anxiety that came with that. I'm not saying it's not nerve-racking, and not something to be sensibly anxious about, but I also feel my allegiances shift over the course of the film. That is an intention of the film, so at the end you're not saying "you fucking bitch, you stabbed that guy, then you locked him in the room." You kind of leave with her; you're pleased she didn't stay in that glass box. Some say she hasn't got empathy, that she traps this guy and kills another. I disagree with that. The empathy she has is with Kyoko, and people say "oh, humans are empathetic" and I say no, we're selectively empathetic. It's not entirely comforting, I guess, but I'm not trying to be comforting. I'm trying to be honest.
But you've got to accept that with the last shot of Ava melting into the crowd, I felt quite unsettled: that could be any of us, that could be real and you wouldn't know. *
AG: I think my point is that if you accept she has something in her which is like our consciousness, then every ethical responsibility that we have for each other also applies to her. It's as simple as that. If you have a machine that you tell you're going to switch off and it says "I don't want to be switched off", and you have reason to believe that machine is telling the truth, then there's an ethical dimension to switching off that machine. The film is simply posing that. Also, within these things there tends to be a division: there is the consciousness of the machine and the consciousness of the human, and the human is always valued more. That doesn't make any sense to me. We're housed in our brains. Our bodies are machines. You watch someone die; it's a machine shutting down. If you cut my arm off, I still exist. If you cut my brain out, I don't exist. If your consciousness can be rehoused in a machine, you continue to exist. All the rights I give you continue to exist. I'm not drawing a distinction between Ava and us.
How big was the step to being director for the first time?
AG: I've been working in film for fifteen years; I just see it as filmmaking. I don't deify directors any more than DoPs, writers or actors; it's just a group of people making a film. That's how I see it. Actually, in terms of making images, I used to write comic books – that's how I'd earn money. So I'm used to images, but mainly it's collaboration. Mainly it is DoPs, production designers, artists; many of these guys I've worked with several times. I've made six films with the production designer, and I've worked with Domhnall Gleeson three times. These are old colleagues.
Was it clear from the beginning that you wanted to shoot and direct this movie on your own?
AG: I didn't make it on my own. There was a group of people. I didn't seek a director because I didn't see the point. I'm anti-auteur; I haven't seen any evidence for auteurism. I'm not saying it doesn't exist – there probably are some auteurs, Woody Allen, sure – but I don't really care if there are auteurs. In my experience it is about collaboration.
Actors Domhnall Gleeson and Alicia Vikander
Do you think Ava is justified in her actions at the end?
AV: Yes, personally. I think you have to put yourself in her position and remember that she has consciousness. I mean, what if it was a young child, a young girl in that position? I think she would be completely justified in her actions. That’s what you’ve got to remember, that she has consciousness. She thinks like us.
What attracted you to the project?
DG: Well I love working with Alex, I’ve worked with him three times now, and he’s such a talented writer so I leapt at the chance of working with him again. I read the script and loved it, I thought it was very interesting and really quite profound. Also the opportunity to work with Alicia and Oscar was a massive plus.
There have been reports that you are in the new ‘Star Wars’ film. Can you give us any details about that?
DG: Yes, I play a character called Daniel! No, no, I'm joking (laughs). Sorry, it's an Irish name. No, I can't tell you anything about that; it's all very top-secret. It was nice seeing Oscar there, though – always nice to see a friendly face at one of those things.