UK should campaign for international ban on autonomous killer drones

http://www.theguardian.com/world/2014/oct/22/uk-campaign-international-ban-autonomous-drones-ex-gchq-chief

Britain should lead the campaign for an international ban on development of autonomous “killer robots” but existing armed drone technology poses no “convincing ethical” problems, according to a policy commission headed by a former director of GCHQ.

The University of Birmingham review headed by Sir David Omand, who was also the UK’s first security and intelligence coordinator, argues that under legal regulation unmanned aerial vehicles (UAVs) provide significant military and civilian benefits.

The report, The Security Impact of Drones: Challenges and Opportunities for the UK, draws together expertise from leading lawyers, manufacturers and military experts to coordinate policy in the face of global proliferation of drones.

The study dismisses fears that “the threshold for the use of force will be lowered by the availability of RPAs (Remotely Piloted Aircraft) to UK Armed Forces” but cautions that this depends on “parliament playing its proper oversight function”.

Officials need to be careful, the commission warns, that intelligence cooperation with the US military – which conducts regular drone strikes in Yemen, Somalia and Pakistan – does not involve British troops or officials in illegal activity.

“The government should confirm that guidance has been issued to staff, and safeguards put in place, to ensure that in sharing intelligence with the US government and military, the UK government does not inadvertently collude in RPA or other counter-terrorist actions contrary to international law,” the report recommends.

But fully autonomous drones that select their own targets should be opposed, the report by the University of Birmingham’s Institute for Conflict, Cooperation and Security argues. “[The] challenge is to deal with the fears of some that the inevitable development of more advanced RPA will eventually lead to ‘killer robots’, fielding Lethal Autonomous Weapons Systems (LAWS) that make their own targeting and weapon release decisions and thus do away with the need for a pilot on the ground,” it says.

“For a weapon system to be developed and used legally in armed conflict, it has to be acceptable under international humanitarian law. We support work to automate many of the sub-systems, such as navigation, that support the RPA. But we doubt it will ever be possible to programme autonomous air systems to be able to exercise distinction between legitimate and illegitimate targets.

“We are not persuaded that it will ever be possible to programme the laws of war into a ‘killer robot’... We fear not all actors will be as prudent, and we would like therefore to see the UK government take a leading role in discussions to build an international consensus around a set of norms to regulate, if not ban, LAWS.”

Those operating the existing generation of drones should be “uniformed, military personnel”, the report maintains, with “the appropriate ethical and technical training, and the requisite educational level and maturity.”

The report adds: “There is no convincing general ethical objection to acquiring RPA, whether armed or unarmed, while the ethical acceptability of their use, like that of other weapon systems, is contextually dependent upon meeting the legal principles of distinction and proportionality.”

Sir David Omand said: “For too long drone technology has carried a burden of ethical suspicion given its controversial use for counter-terrorist strikes by the US. The recent decision to deploy RAF Reaper to Iraq is a welcome sign in line with our findings of the growing acceptance of RPA technology as an essential component of modern military capability – provided it is used strictly in accordance with international law, in the same way as for other UK weapons systems. 

“RPA add precision targeting capabilities and long loiter times that can minimise civilian losses and protect friendly troops. We need not fear that their use by the UK Armed Forces represents a shift in the ethical framework of modern warfare.”

The policy commission suggests three obstacles need to be overcome: improving “public understanding and acceptance of the legal and ethical soundness of the practice”; allaying fears over the potential development of autonomous killer robots; and safeguarding British airspace and the privacy of British citizens if drones are to be increasingly used for domestic surveillance and security.

On use of drones for domestic surveillance, the report states: “Issues of safety and security of airspace and regulation of domestic RPA have to be resolved first. Before police and media surveillance RPA become common in our skies, as we believe they will, the government needs to have consulted the public and established appropriate codes of conduct to safeguard the privacy of the citizen.” The Civil Aviation Authority controls the regulatory regime.

Jennifer Gibson, a staff attorney at the legal charity Reprieve who also sat on the Birmingham commission, said: “When figures such as the former head of GCHQ are suggesting Britain needs to distance itself from the US drone programme, the UK government needs to listen.

“There can no longer be any doubt that covert US drone strikes in Pakistan and Yemen contravene international law. As long as the UK continues to support this programme – through the sharing of intelligence, air bases and personnel – it is complicit in these illegal drone strikes carried out by the CIA. British ministers must now come clean on how far this involvement has gone, and publicly disclose the safeguards that are in place to ensure it is not allowed to continue.”

Other commissioners who worked on the report included: Sir Brian Burridge, of the drone manufacturer Finmeccanica UK; Professor Keith Hayward of the Royal Aeronautical Society; Lt Gen Sir Paul Newton; Sir David Veness, professor of terrorism studies at the University of St Andrews; and Elizabeth Wilmshurst, an international law expert at Chatham House.