Nuclear experts warn that AI will soon be woven into nuclear weapons systems. No one fully knows what that means yet.
In mid-July, Nobel laureates met at the University of Chicago for closed-door talks with top nuclear war experts. The goal: help the laureates advise world leaders on avoiding nuclear disaster.
AI was a key focus. Stanford’s Scott Sagan said,
> “We’re entering a new world of artificial intelligence and emerging technologies influencing our daily life, but also influencing the nuclear world we live in.”
Retired US Air Force Major General Bob Latiff, who helps set the Doomsday Clock, added that AI's integration into nuclear systems is inevitable.
> “It’s like electricity,” Latiff said. “It’s going to find its way into everything.”
But experts admit there’s confusion over what AI actually means in this context. Jon Wolfsthal of the Federation of American Scientists told reporters,
> “The conversation about AI and nukes is hampered by a couple of major problems. The first is that nobody really knows what AI is.”
Stanford professor Herb Lin argued that chatbots have hijacked the debate.
> “What does it mean to give AI control of a nuclear weapon? What does it mean to give a [computer chip] control of a nuclear weapon?” Lin said. “Part of the problem is that large language models have taken over the debate.”
The good news: no one expects ChatGPT to be handed the nuclear codes anytime soon. Wolfsthal said,
> “In this realm, almost everybody says we want effective human control over nuclear weapon decisionmaking.”
But talk persists of AI tools that could help leaders anticipate what rivals like Putin or Xi might do, drawing on massive datasets of their past statements to predict behavior.
Wolfsthal explained,
> “A number of people have said, ‘Well, look, all I want to do is have an interactive computer available for the president so he can figure out what Putin or Xi will do and I can produce that dataset very reliably. I can get everything that Xi or Putin has ever said and written about anything and have a statistically high probability to reflect what Putin has said.’”