The UK is Funding Development of Fully-Autonomous Military Drones

It has emerged that the UK is funding the development of fully autonomous military drones designed for combat.

Over the weekend, Drone Wars UK, a small London-based NGO, published a report detailing the British government's funding of research programs intended to facilitate the development of fully autonomous, armed military drones capable of making their own targeting decisions.

The British government has previously denied working to develop drones of this type. According to the report, the UK’s Ministry of Defence policy states that “the UK opposes the development of autonomous weapon systems and has no intention of developing them.”

The key distinction is that the drones currently used by the British military are remotely controlled by humans, whether at Creech Air Force Base in Nevada, RAF Waddington in Lincolnshire, or Larkhill on Salisbury Plain.

Yet this new report claims that the British government is funding research into drones that could fly, identify targets, and attack without direct human supervision or control. This is uncharted territory and a potential game-changer in the development of military AI.

Why Is the UK Developing AI Military Drones?

A fully autonomous drone would offer a wide range of military benefits. Such drones could maintain constant surveillance of targets, for example, or could help spare soldiers the psychological toll of having to kill remotely from a distance.

However, the UK’s strategy for procuring drones up to this point has been to buy “off-the-shelf” models, such as the Desert Hawk 3 manufactured by Lockheed Martin, or General Atomics’ MQ-9 Reaper. The decision to start funding research into AI and machine learning for combat missions represents a new turn for the UK military.

The General Atomics MQ-9 Reaper currently in use by the British military (though this one has US insignia)

According to the UK’s Industrial Strategy, published in November 2017, developing artificial intelligence capabilities is one of the four “Grand Challenges” facing the UK in the coming years. The current government wants to put the UK “at the forefront of the artificial intelligence and data revolution.” This, naturally, requires funding. And the UK government is prepared to throw serious money at the challenge, setting aside £500 million to facilitate developments in Britain’s AI capability.

Could this new investment in home-grown AI tech be designed to create new autonomous military aircraft for Britain to sell? It certainly seems possible. Selling arms is already a big money-spinner for the UK economy — according to data from the Campaign Against the Arms Trade, the value of British weapons exports between June 2017 and June 2018 totalled nearly £8.7 billion (roughly $11.3 billion). And being the first to offer a fully independent flying killing machine would give Britain a market edge.

Should We Be Scared?

Yes, regardless of whether you live in a warzone or in the safety of western suburbia. The potential for these drones to malfunction, be compromised and misused, or simply creep into everyday life is frightening.

Despite hopes that the advent of drone warfare would permit a cleaner, more precise form of fighting, particularly when hunting high-value individuals in the global war on terror, the precision promised isn't evident from the facts on the ground.

In fact, as Drone Wars UK’s report points out, there are several inherent flaws with AI in its current state that make it unsuitable for combat:

Unpredictable behavior

“Unpredictability is an inherent feature of artificial intelligence and self-learning computer systems, because we do not necessarily understand how they work. Decision-making by such systems is derived largely from processes which have evolved, rather than been programmed by humans. Decisions will therefore be based on opaque logic, lacking a clear chain of analysis, and it may not be possible for humans to validate them… This raises real problems, because it may not be possible to fully trust the outputs of an autonomous system which relies on machine learning systems.”

Loss of command and control

“Even though lethal autonomous drones of the future would be able to operate largely independently, there would still need to be some degree of human command and control to ensure they remained within their operating parameters. Loss of control… might occur as a result of a loss of communication, or, more seriously, as a result of human intervention through jamming, hacking, or spoofing the system. A successful cyber attack on an autonomous weapon system could be potentially very serious.”

Sound familiar?

‘Normal’ accidents

“Complex systems… are intrinsically vulnerable to ‘normal accidents.’ During the Iraq War in 2003 the US Army’s Patriot air defence system, a complex and highly automated weapon system, was involved in two ‘friendly fire’ incidents which can be explained by normal accident theory… Like all normal accidents, the examples given here could be considered freak occurrences. But this is the nature of normal accidents, which are unpredictable and unavoidable. Human-machine learning, often cited as a strategy for reducing the risks associated with autonomous weapons, does not eliminate these risks and under some circumstances may add a further level of complexity and scope for error.”

Misuse

“Autonomous weapons systems and lethal autonomous drones may be designed with a range of safety features intended to ensure they can only be used for specific purposes… Nevertheless, experience suggests that eventually such a system will be misused for a purpose for which it was not intended… Autonomous weapons could commence in uncluttered environments but gradually ‘creep’ into more complex ones without taking into account the different situation and the potential of an increased risk of harm to non-combatants.”

What Does This Mean for AI Warfare?

We're on the cusp of a future in which wars are fought by warriors that don't sleep, eat, or have feelings, and that could potentially operate without direct human oversight — a prospect scary enough for most.

However, it also means that the British government is quite happy to keep its efforts to develop weapons of this kind secret from the British public and the rest of the world. This, frankly, is even scarier. Without the report from Drone Wars UK, we might never have known about these plans.

It also raises the likelihood that other countries are actively developing their own capabilities to produce fully autonomous drones. Given that the UK isn't even the world leader in high-tech military hardware, it's easy to imagine that similar programs are underway in China, Israel, and the US.

Will World War 3 be fought by robots? It seems likelier than ever.


Written by:
Tom Fogden is a writer for Tech.co with a range of experience in the world of tech publishing. Tom covers everything from cybersecurity, to social media, website builders, and point of sale software when he's not reviewing the latest phones.