A recent United Nations convention in Geneva, Switzerland addressed the topic of autonomous weapons, termed “killer robots,” and how the international community might regulate them going forward. Field experts from a number of countries were brought together to work toward a universally agreed-upon definition of “killer robots” and to try to reach some consensus on how they should be regulated, or whether they should be banned outright. While the technologies that would make fully autonomous war machines possible are not quite there yet, AI has been advancing faster than international authorities and even country-level lawmakers can regulate it, necessitating speculative talks like this one.
The countries involved, according to meeting chair Amandeep Gill, tend to fall into one of three camps: some are pushing for a wholesale ban on the creation of autonomous weapons, others want some kind of agreement on how the systems should be used if they are built, and a third subset seemingly wants to simply let the technology advance and see how it goes. Without a clear and overwhelming consensus, no international laws can be made at this point. According to Gill, however, the countries involved in the week-long summit are working toward a consensus on a general direction, along with some basic guidance on how regulation should be approached. “…this is not an insignificant outcome,” he said of the proceedings. Nobel laureate Jody Williams said that 26 countries have thus far called for a total ban on autonomous weapons of war.
This ongoing convention is not the first attempt to address and regulate autonomous weapons on the world stage. Unlike the last international conference on the subject, however, this one did not feature an actual AI-driven robot as a speaker. Beyond the attention of international leaders, AI weaponry has drawn scrutiny from ordinary citizens, as well as from tech companies and their employees. Google, for example, recently experienced an employee upheaval, with some workers going as far as quitting their jobs, in protest of the company’s involvement with Project Maven, a US Department of Defense contract meant to help drones identify people on the fly. Artificial intelligence, as a field, is moving in many different directions at once, and it is hard to say which capabilities could come together to create an autonomous war machine, or who might create or implement them. As a result, most firms on the front lines of development are approaching the field with caution.