The term “killer robots” refers to fully autonomous lethal offensive weapons (such as drones) that select and engage targets without any human intervention. The main concern about such weapons, expressed by human rights groups opposed to their use, is that machines, rather than humans, “could ultimately make life-or-death decisions on the battlefield or in law enforcement”.
On April 13th, The Guardian published a report by legal affairs correspondent Owen Bowcott titled “UK opposes international ban on developing ‘killer robots’”.
The article begins:
The UK is opposing an international ban on so-called “killer robots” at a United Nations conference that is this week examining future developments of what are officially termed lethal autonomous weapons systems (Laws).
Experts from the Foreign Office and the Ministry of Defence are participating in the week-long session in Geneva which will consider whether increased computing power will eventually enable drones and other machines to select targets and carry out attacks without direct human intervention.
The Campaign to Stop Killer Robots, an alliance of human rights groups and concerned scientists, is calling for an international prohibition on fully autonomous weapons.
Yet, note the photo The Guardian used to illustrate the story.
Do you see the problem?
The Guardian used a photo of Israel’s Iron Dome defensive missile system to illustrate an article about autonomous offensive weapons systems. In fact, the Iron Dome is a defensive system that intercepts enemy rockets fired at populated areas of the country. Moreover, the system is operated by soldiers in a command center who, once they receive data on an incoming rocket’s trajectory, decide whether to launch an interceptor.
So, the Iron Dome system is neither offensive nor fully autonomous, and thus can’t be considered a “killer robot”.
The Guardian photo is inappropriate and extremely misleading.