21st Century Arms Race? Weaponised Artificial Intelligence
A global ban on developing artificially intelligent robot soldiers is needed to protect the future of mankind, a letter from hundreds of leading scientists including Stephen Hawking has warned.
The letter, presented at a major conference in Argentina, suggests that building autonomous weapons that select and engage targets on their own, unlike remote-controlled drones, is "feasible within years".
DARPA's LS3 is a rough-terrain robot designed to go anywhere.
Apple co-founder Steve Wozniak and SpaceX entrepreneur Elon Musk were also among the signatories, who warned that developing autonomous weapons would be a "bad idea".
The letter said: "If any major military power pushes ahead with AI weapon development, a global arms race is virtually inevitable, and the endpoint of this technological trajectory is obvious: autonomous weapons will become the Kalashnikovs of tomorrow.
"Autonomous weapons are ideal for tasks such as assassinations, destabilising nations, subduing populations and selectively killing a particular ethnic group."
"We therefore believe that a military AI arms race would not be beneficial for humanity."
AI could "make battlefields safer for humans", including civilians, the organisation said, because weapons could be restricted to specific targets meeting pre-defined criteria programmed by humans.
Just as most chemists and biologists have no interest in developing chemical or biological weapons, the letter added, most robotics researchers have "no interest in building AI weapons".
"We believe that AI has great potential to benefit humanity in many ways, and that the goal of the field should be to do so. Starting a military AI arms race is a bad idea, and should be prevented by a ban on offensive autonomous weapons beyond meaningful human control," the scientists concluded.