Google self-driving car collides with bus in California, accident report says
One of Google’s self-driving cars has collided with a public bus in Mountain View, an accident report has revealed, in what appears to be the first example of one of the company’s prototype cars causing an accident.
The incident happened on 14 February and was reported to California’s department of motor vehicles in an accident report that the agency posted on 29 February.
The car was rolling at 2mph (3kph) and the bus at 15mph. No one was injured.
The report does not address fault. However, Google wrote that its car was trying to get around some sandbags on a street when its left front struck the right side of the bus.
The car’s test driver, who under state law must be in the front seat to grab the wheel when needed, thought the bus would yield and did not have control when the collision happened, according to Google’s report.
If it is determined the Google vehicle was at fault, it would be the first time one of its SUVs caused an accident while in autonomous mode.
Jessica Gonzalez, a spokeswoman for the DMV, said the agency hoped to speak with Google on Monday about what went wrong.
In a detailed statement from its monthly report on its self-driving car project, Google said that the incident happened on El Camino Real, a busy six-lane boulevard with hundreds of intersections. The car was acting on a recent change to its programming, following “the spirit of the road” as well as the traffic code by hugging the far side of the right-turn lane to allow other cars to pass on the left.
“It then detected sandbags near a storm drain blocking its path, so it needed to come to a stop. After waiting for some other vehicles to pass, our vehicle, still in autonomous mode, began angling back toward the center of the lane at around 2mph – and made contact with the side of a passing bus traveling at 15mph. Our car had detected the approaching bus, but predicted that it would yield to us because we were ahead of it,” Google said.
“Our test driver, who had been watching the bus in the mirror, also expected the bus to slow or stop. And we can imagine the bus driver assumed we were going to stay put. Unfortunately, all these assumptions led us to the same spot in the lane at the same time. This type of misunderstanding happens between human drivers on the road every day.”
Google says it has refined its software following the incident, acknowledging that buses and other large vehicles are less likely to yield. “In this case we clearly bear some responsibility because if our car hadn’t moved there wouldn’t have been a collision.”
“We hope to handle situations like this more gracefully in the future.”
Hilary Rowen, a partner in the insurance regulation practice at Sedgwick LLP and an expert on self-driving cars and legal responsibility, said the case is a good example of a conundrum that will soon be common.
“Here, the software didn’t avoid the accident, but the human could have taken over,” she said. “Who’s at fault – the driver, the bus driver, or the software?”
Rowen said that in real-world situations both the driver and the injured party will be incentivized to blame the software, which, if found to be at fault, would leave the driver’s record clear and would likely mean a higher payout for the injured party.
“Everybody’s going to be blaming the software all the time,” Rowen said. “All the time.” Rowen still thinks autonomous car insurance will be cheaper than human-driven car insurance because humans aren’t very good drivers.
“At a very visceral level, people will accept a higher chance of being maimed or killed by a human being than they will by being maimed or killed by software,” she said. “The self-driving car will likely be able to make better risk calculations.”
Google has been testing two dozen Lexus SUVs outfitted with sensors and cameras near the tech firm’s Silicon Valley headquarters.
Google cars have been involved in more than a dozen collisions. In most cases, Google’s cars were rear-ended. No one has been seriously injured.