Ban on killer robots urgently needed, say scientists
Technology now exists to create autonomous weapons that can select and kill human targets without supervision, as the UN is urged to outlaw them
Ian Sample, Science editor
Mon 13 Nov 2017 00.01 GMT
Last modified on Mon 27 Nov 2017 13.53 GMT
The movie portrays a brutal future. A military firm unveils a tiny drone that hunts and kills with ruthless efficiency. But when the technology falls into the wrong hands, no one is safe. Politicians are cut down in broad daylight. The machines descend on a lecture hall and spot activists, who are swiftly dispatched with an explosive to the head.
The short, disturbing film is the latest attempt by campaigners and concerned scientists to highlight the dangers of developing autonomous weapons that can find, track and fire on targets without human supervision. They warn that a preemptive ban on the technology is urgently needed to prevent terrible new weapons of mass destruction.
Stuart Russell, a leading AI scientist at the University of California, Berkeley, and others will show the film on Monday during an event at the United Nations Convention on Conventional Weapons hosted by the Campaign to Stop Killer Robots. The manufacture and use of autonomous weapons, such as drones, tanks and automated machine guns, would be devastating for human security and freedom, and the window to halt their development is closing fast, Russell warned.
“The technology illustrated in the film is simply an integration of existing capabilities. It is not science fiction. In fact, it is easier to achieve than self-driving cars, which require far higher standards of performance,” Russell said.
The military has been one of the largest funders and adopters of artificial intelligence technology. The computing techniques help robots fly, navigate terrain, and patrol territories under the seas. Hooked up to a camera feed, image recognition algorithms can scan video footage for targets better than a human can. An automated sentry that guards South Korea’s border with the North draws on the technology to spot and track targets up to 4km away.
While military drones have long been flown remotely for surveillance and attacks, autonomous weapons armed with explosives and target recognition systems are now within reach and could locate and strike without deferring to a human controller. Opponents believe that handing machines the power over who lives and dies crosses a clear moral line.
“Pursuing the development of lethal autonomous weapons would drastically reduce international, national, local, and personal security,” Russell said. Scientists used a similar argument to convince presidents Lyndon Johnson and Richard Nixon to renounce the US biological weapons programme and ultimately bring about the Biological Weapons Convention.
Because AI-powered machines are relatively cheap to manufacture, critics fear that autonomous weapons could be mass produced and fall into the hands of rogue nations or terrorists who could use them to suppress populations and wreak havoc, as the movie portrays.
A treaty banning autonomous weapons would prevent large-scale manufacturing of the technology. It would also provide a framework to police nations working on the technology, and the spread of dual-use devices and software such as quadcopters and target recognition algorithms. “Professional codes of ethics should also disallow the development of machines that can decide to kill a human,” Russell said.
In August, more than 100 of the world’s leading robotics and AI pioneers called on the UN to ban the development and use of killer robots. The open letter, signed by Tesla’s chief executive, Elon Musk, and Mustafa Suleyman, a co-founder of Alphabet’s DeepMind AI unit, warned that an urgent ban was needed to prevent a “third revolution in warfare”, after gunpowder and nuclear arms. So far, 19 countries have called for a ban, including Argentina, Egypt and Pakistan.
Noel Sharkey, the emeritus professor of AI at Sheffield University and chair of the International Committee on Robot Arms Control, warned about the dangers of autonomous weapons 10 years ago. “The movie made my hair stand on end as it crystallises one possible futuristic outcome from the development of these hi-tech weapons,” he said. “There is an emerging arms race among the hi-tech nations to develop autonomous submarines, fighter jets, battleships and tanks that can find their own targets and apply violent force without the involvement of meaningful human decisions. It will only take one major war to unleash these new weapons with tragic humanitarian consequences and destabilisation of global security.”
Criminals and activists have long relied on masks and disguises to hide their identities, but new computer vision techniques can essentially see through them. Earlier this year, Indian government-funded scientists worked with Cambridge University on an algorithm to identify people who obscured their faces with hats and sunglasses, fake beards and scarves. It remains a hard technical problem, but face recognition is only one way to identify people. “A balaclava does not hide one’s gender or age or ethnicity. And it could easily become the ‘norm’ that the weapons will also attack those deemed to be preventing identification or classification by covering up face and body,” Russell said.
In 2015, the UK government opposed an international ban on killer robots. The Foreign Office said it saw no need for the prohibition as international humanitarian law already regulated the area. “The UK is not developing lethal autonomous weapons systems, and the operation of weapons systems by the UK armed forces will always be under human oversight and control,” a Foreign Office spokesperson said at the time.
But according to the Campaign to Stop Killer Robots, a number of nations, including the US, China, Russia, Israel, South Korea and the United Kingdom, are moving toward systems that would give “greater combat autonomy” to machines.
“The UN moves at iceberg pace and actors with vested interests put obstacles in the path at every turn,” Sharkey said. “But the campaign continues to move forward with massive support from the scientific community. We must succeed because the alternatives are too horrifying.”