Christopher Nolan wants the AI industry to learn from his film Oppenheimer

The director wants the AI industry in particular to learn from his film.

J. Robert Oppenheimer is commonly referred to as the father of the atomic bomb. To pre-empt the Nazis, the American physicist of German-Jewish descent played a major role in the Manhattan Project and eventually directed its Los Alamos laboratory.

The goal: to invent and detonate the world’s first atomic bomb, known as The Gadget.

After Oppenheimer learned that the American military had dropped an atomic bomb on Hiroshima, he realised the full consequences of his work, which he came to regret deeply.

In a 1965 interview, he quoted the Bhagavad Gita, a sacred Hindu scripture, with a phrase that would be forever associated with him:

Now I am become Death, the destroyer of worlds.

Christopher Nolan hopes Silicon Valley learns from his film

On 15 July 2023, Oppenheimer was screened for journalists in the US, followed by a Q&A session, as reported by The Verge. There, the question was raised of what Silicon Valley should learn from the film.

I hope they take away from the film the concept of responsibility.

Nolan continues: If you innovate with the help of technology, you also have to make sure that there is accountability.

The director is alluding to companies that launch technological developments at the drop of a hat while refusing to acknowledge the damage those developments may cause.

Over the last 15 years, more and more companies have been throwing around words like “algorithm” without knowing what they mean in any meaningful, mathematical sense. They just don’t want to take responsibility for whatever the algorithm does.

Christopher Nolan, director of Oppenheimer

This statement takes on a special flavour when you consider that just last week Meta launched Threads, a new short-message service that recommends content based solely on its algorithm.

What does this mean in the age of AI?

Nolan eventually arrived at this point himself, and he has a clear answer:

Applied to AI, terrible possibilities open up. Terrible.

He continues:

Not least because AI systems, once integrated into defence infrastructure, will eventually be put in charge of nuclear weapons. If we allow people to say that this is a separate entity from the person who operates, programs and deploys the AI, then we are doomed.

Christopher Nolan, director of Oppenheimer

Nolan urges responsibility here too. You can’t blame an AI, he says, when there are humans behind it who programmed it and operate it.

The director is talking about AI systems here, but he is picking up on his earlier statement about algorithms. He doesn’t name any specific companies, but firms such as Google and Netflix rely heavily on their algorithms to grow and keep their audiences satisfied.

A stark negative example that backs up Nolan’s claims is the influence of Meta, or rather of its algorithm, on the genocide in Myanmar.

Light at the end of the tunnel

The picture Nolan paints is a bleak one, but he has hope.

When I talk to the leading researchers in the field of AI, they literally refer to this moment as their Oppenheimer moment. They look to his story to become aware of their responsibility when developing new technologies that can have unintended consequences.

Christopher Nolan, director of Oppenheimer

After Nolan gave this answer to the journalists, one of the attendees asked: Do you think that’s how they think in Silicon Valley right now?

Nolan’s answer:

At least they say they do – and that’s helpful. At least it’s in the conversation. And I hope that this thought process will continue. I’m not suggesting that Oppenheimer’s story offers easy answers to these questions, but at least the film serves as a cautionary tale.

Christopher Nolan, director of Oppenheimer

Christopher Nolan’s Oppenheimer opens here on 20 July. Then you can see for yourselves. Our colleagues at Filmstarts explain why Oppenheimer was allowed to be exactly three hours long and not a second longer.

In the context of his new film, Christopher Nolan addresses important ethical questions about technology. What do you think of the director’s statements? Is he being overly alarmist, or should we all take more responsibility when dealing with technology, and especially AI? Feel free to let us know in the comments.