DOD official: World faces ‘Terminator conundrum’ on AI weapons

The US military faces a “Terminator conundrum” when it comes to artificially intelligent killing machines, including armed UAVs and loitering munitions.

The Department of Defense (DOD) does not field such weapons today, says one senior military official – but the technology is nevertheless close at hand, and other nations might press forward.

Speaking at the Brookings Institution in Washington DC today, vice chairman of the joint chiefs of staff Gen Paul Selva says there should be a national and global debate on “AI” weapons for air, land, sea and undersea combat.

“We have proven that we can build and field unmanned underwater vehicles, unmanned surface vessels, unmanned wheeled vehicles, and remotely piloted air vehicles,” he says. “We can actually build ‘autonomous’ vehicles in every one of those categories.

“That gets us to the cusp of a question about whether or not we are willing to have unmanned autonomous systems that can launch on an enemy. What happens when that thing can inflict mortal harm and is empowered by artificial intelligence?”


The Northrop Grumman X-47B is a proposed carrier-based UAV that would be armed and largely autonomous, but a human operator would make any decision to employ weapons.

US Navy

Selva says the technology is already here, with rudimentary AI systems monitoring day-to-day bank transactions and mining large volumes of data. But there are ethical, political and laws-of-war questions that must be answered before these types of weapons enter combat, he says.

“I call it ‘the Terminator conundrum’,” he says. “That’s a debate we need to have, I would argue nationally and internationally, to answer if we as humans want to cross that line.”

The US government has initiated and then cancelled several UAV and missile programmes that would have autonomously identified and destroyed targets based on “hard-coded” decision metrics.

“They are robotic, but not intelligent. There is a significant difference,” Selva says, adding that true AI machines could study targets and track them, but the final decision to launch weapons should remain with humans. “That’s about as far as I’m willing to go at this point,” he says.

High-profile technologists such as Elon Musk and Stephen Hawking have come out against AI weapons, saying they could spell disaster for humanity.

Concerns about armed robots entered popular culture with The Terminator movie in 1984, but became a more pressing worry when the General Atomics Aeronautical Systems MQ-1 Predator UAV was armed with the AGM-114 Hellfire missile in 2002.

