August 22, 2017
A recent vote out of the United Nations has initiated formal discussions on AI weapons: Should tanks, drones, and machine guns operate using processes automated through artificial intelligence? Not according to inventor and business magnate Elon Musk, who is among a group of 116 experts calling for a ban on these types of AI weapons.
Musk and the rest of the specialists — from a total of 26 different countries — sent an open letter to the UN explaining their position. Here’s what you need to know.
It Could Be a “Third Revolution in Warfare”
The Guardian has a widely cited rundown of the background behind the open letter.
“In their letter, the founders warn the review conference of the convention on conventional weapons that this arms race threatens to usher in the ‘third revolution in warfare’ after gunpowder and nuclear arms,” The Guardian explains.
“[…] Experts have previously warned that AI technology has reached a point where the deployment of autonomous weapons is feasible within years, rather than decades. While AI can be used to make the battlefield a safer place for military personnel, experts fear that offensive weapons that operate on their own would lower the threshold of going to battle and result in greater loss of human life.”
The problem with banning these lethal autonomous weapons is that, to some people, a third revolution in warfare isn’t necessarily a bad thing. It’s a Pandora’s box, as the letter itself says in a choice quote.
“These can be weapons of terror, weapons that despots and terrorists use against innocent populations, and weapons hacked to behave in undesirable ways. We do not have long to act. Once this Pandora’s box is opened, it will be hard to close.”
The UK government opposed a ban on AI weapons back in 2015, claiming international laws already offered “sufficient regulation.” This new letter might have an effect on current policy negotiations, but it’s fighting an uphill battle.