Can you murder a robot?
Back in 2015, a hitchhiker was murdered on the streets of Philadelphia.
It was no ordinary crime. The hitchhiker in question was a little robot called Hitchbot. The "death" raised an interesting question about the human-robot relationship - not so much whether we can trust robots but whether the robots can trust us.
The answer, it seems, was no.
Hitchbot has now been rebuilt, at Ryerson University, in Toronto, where it was conceived.
Its story is perhaps the ultimate tale of robot destruction, made all the more poignant by the fact that it was designed to be childlike and entirely non-threatening.
With pool noodles for arms and legs, a transparent cake container for a head, a white bucket as a body, and a child's car seat so that anyone picking it up could transport it safely, it was cartoon-like. If a child designed a robot, it would probably look like Hitchbot.
The team deliberately made it on the cheap - describing its look as "yard-sale chic". They were aware that it might come to harm.
In order to qualify as a robot, it had to have some basic electronics - including a Global Positioning System (GPS) receiver to track its journey, movements in its arms, and software to allow it to communicate when asked questions. It could also smile and wink.
And, of course, it could move its thumb into a hitch position.
"It was extremely important that people would trust it and want to help it out, which is why we made it the size of a child," said Dr Frauke Zeller, who led the team with her husband, Prof David Smith.
The adventure started well, with Hitchbot being picked up by an elderly couple and taken on a camping trip in Halifax, Nova Scotia, followed by a sightseeing tour with a group of young men. Next, it was a guest of honour at a First Nations powwow, where it was given a name that translates to "Iron Woman", assigning it a gender.
The robot picked up thousands of fans along the way, many travelling miles to be the next person to give it a lift.
Sometimes, the robot's GPS location had to be disabled so that those who took it home wouldn't be mobbed outside their houses.
The robot certainly appealed and the team behind it were swamped with international press enquiries from the outset.
Hitchbot was given its own social media accounts on Twitter, Facebook and Instagram and became an instant hit, gaining thousands of followers.
"People began to decorate Hitchbot with bracelets and other jewellery. This little robot with its simple design triggered so much creativity in people. And that was one of the biggest takeaways of the experiment, that we should stop telling people what to do with technology," Dr Zeller said.
But Hitchbot's adventure was about to come to an abrupt end.
"One day we received images of Hitchbot lying in the street with its arms and legs ripped off and its head missing," Dr Zeller said.
"It affected thousands of people worldwide. Hitchbot had become an important symbol of trust. It was very sad and it hit us and the whole team more than I would have expected."
Now, the team have rebuilt Hitchbot, even though its head was never found. They missed having it around and had been inundated with requests for Hitchbot 2.0, although they have no plans for another road trip.
BBC News joined Prof Smith and Dr Zeller to take Hitchbot 2.0 on one of its first outings, to the safety of a cafe next to the university. The robot was instantly recognised by passers-by, many of whom stopped to chat and take a Hitchbot selfie. All of them seemed overjoyed to see the robot back in one piece.
The Ryerson team is also working with SoftBank's Pepper, an archetypal big-eyed childlike robot, on another test of the trust relationship with humans. Pepper will be used to talk with patients about cancer care. The theory is that patients will communicate more openly with Pepper than they would with a human carer.
Beating up bots
Hitchbot is not the first robot to meet a violent end.
Dr Kate Darling, of the Massachusetts Institute of Technology (MIT), encouraged people to hit dinosaur robots with a mallet, in a workshop designed to test just how nasty we could be to a machine.
She also conducted an experiment with small bug-like robots.
Most people struggled to hurt the bots, Dr Darling found.
"There was a correlation between how empathetic people were and how long it took them to hit a robot," she told BBC News, at her lab in Boston.
"What does it say about you as a person if you are willing to be cruel to a robot? Is it morally disturbing to beat up something that reacts in a very lifelike way?" she asked.
The reaction of most people was to protect and care for the robots.
"One woman was so distressed that she removed the robot's batteries so that it couldn't feel pain," Dr Darling said.
Prof Rosalind Picard, who heads up the Affective Computing Lab, also based at MIT, thinks it comes down to human nature.
"We are made for relationships, even us engineers, and that is such a powerful thing that we fit machines into that," she said.
But while it is important that robots understand human emotions because it will be their job to serve us, it might not be a good idea to anthropomorphise the machines.
"We are at a pivotal point where we can choose as a society that we are not going to mislead people into thinking these machines are more human than they are," Prof Picard told BBC News, at her lab.
"We know that these machines are nowhere near the capabilities of humans. They can fake it for the moment of an interview and they can look lifelike and say the right thing in particular situations."
"A robot can be shown a picture of a face that is smiling but it doesn't know what it feels like to be happy.
"It can be given examples of situations that make people smile but it doesn't understand that it might be a smile of pain."
But Prof Picard admitted it was hard not to develop feelings for the machines we surrounded ourselves with and confessed that even she had fallen into that trap, treating her first car "as if it had a personality".
"I blinked back a tear when I sold it, which was ridiculous," she said.
At her lab, engineers design robots that can help humans but do not necessarily look human.
One project is looking at robots that could work in hospitals as a companion to children when their parents or a nurse is not available. And they are working on a robot that will be able to teach children but also show them how to cope with not knowing things.
We may have to limit our emotional response to robots but it is important that the robots understand ours, according to Prof Picard.
"If the robot does something that annoys you, then the machine should see that you are irritated and - like your dog - do the equivalent of putting its tail down, putting its ears back and looking like it made a mistake," she said.
Killer robots
Roboticist Prof Noel Sharkey also thinks that we need to get over our obsession with treating machines as if they were human.
"People perceive robots as something between an animate and an inanimate object and it has to do with our in-built anthropomorphism," he told BBC News.
"If objects move in a certain way, we think that they are thinking.
"What I try and do is stop people using these dumb analogies and human words for everything.
"It is about time we developed our own scientific language."
To prove his point, at a recent conference he picked up an extremely cute robotic seal, designed for elderly care, and started banging its head against a table.
"People were calling me a monster," he said.
Actually, Prof Sharkey is much more of a pacifist - and leads the campaign to ban killer robots, something he thinks is a far more pressing ethical issue in modern-day robotics.
"These are not human-looking robots," he said.
"I'm not talking about Terminators with a machine gun.
"These weapons look like conventional weapons but are designed so that the machine selects its own target, which to me is against human dignity."
Prof Sharkey listed some of the current projects he thought were crossing the line into unethical territory.
And he has been working at the UN for the past five years to get a new international treaty signed that either bans the use of such weapons or states that they can never be used without "meaningful human control" - 26 nations are currently signed up, including China.
Listen to more on this story: Can you murder a robot? The Documentary, BBC World Service, airing 17 March