Aside from a potential nuclear war between the US and North Korea, one of the biggest issues facing international security is autonomous weapons.
Often referred to as killer robots, these are weapons systems that can select and fire at targets without human intervention.
As part of the United Nations Convention on Conventional Weapons taking place in Geneva this week, governmental representatives and academics will meet to discuss lethal autonomous weapons systems.
Campaign groups, such as the Campaign to Stop Killer Robots, will be present to argue that these weapons should be pre-emptively banned before they ever enter official use.
The calls to ban killer robots
At the moment, there are no autonomous weapons in action.
According to a research paper on artificial intelligence (AI) and the future of warfare by Mary Cummings, a Chatham House researcher and director of the Humans and Autonomy Laboratory at Duke University, the robotic devices used by the military are still often directly controlled by humans.
For instance, drones used by armies are more automatic than autonomous. Cummings said:
Most military [unmanned vehicles] are only slightly more sophisticated: they have some low-level autonomy that allows them to navigate, and in some cases, land, without human intervention, but almost all require significant human intervention to execute their missions.
However, the Campaign to Stop Killer Robots aims to ensure that things never reach the stage where autonomous devices are loaded with explosives. According to the group’s website:
The concern is that low-cost sensors and rapid advances in AI are making it increasingly possible to design weapon systems that would target and attack without human intervention. If this trend towards autonomy continues, the fear is that humans will start to fade out of the decision-making loop, first retaining only a limited oversight role and then no role at all.
What will be discussed at the meeting this week?
There are three main areas that will be discussed at this week’s meeting: emerging technologies, military effects and the legal and ethical dimensions of autonomous robots.
The Convention on Conventional Weapons will not decide to ban or regulate lethal autonomous weapons. Instead, it is an opportunity for different groups to come together and discuss the implications of such devices.
This will inform a meeting on 22 November, the annual Meeting of High Contracting Parties, where states will decide on how to proceed next.
Could there be an eventual ban on killer robots?
There have been calls for bans before. In 2015, the UK opposed a ban on killer robots, telling the Guardian:
At present, we do not see the need for a prohibition on the use of [autonomous weapons] as international humanitarian law already provides sufficient regulation for this area.
However, there appears to be more impetus for a ban this year. In August, 116 leaders in the AI industry, including Tesla’s chief executive Elon Musk, signed an open letter warning that the development of such robots poses a huge threat to humanity.
If you’re not concerned about AI safety, you should be. Vastly more risk than North Korea. pic.twitter.com/2z0tiid0lc
— Elon Musk (@elonmusk) 12 August 2017
With the AI industry itself calling for a ban on such devices, that pressure could go some way towards informing future policy on the topic.