Consumer group demands Google answer robot-car questions

As federal officials prepare to hold a public hearing Friday on self-driving cars, a consumer-advocacy group is demanding that Google explain how safe – and unsafe – its self-driving cars are.

The National Highway Traffic Safety Administration’s public meeting in Washington, D.C., is intended to help the agency develop guidelines for driverless cars. Another meeting will be held in California this spring, but the date and location haven’t been announced yet.

Google last month asked U.S. senators to fast-track regulations that would allow self-driving cars with no brake pedals or steering wheels on the nation’s roads. The traffic safety administration said last month that U.S. Transportation Department research had concluded that “there are few existing federal regulatory hurdles to deployment of automated vehicles with traditional designs and equipment to accommodate a human driver . . . [but] there may be greater obstacles to vehicle designs without controls for human drivers, such as a steering wheel or brake pedals.”

Consumer Watchdog, a vocal opponent of rapid deployment of completely autonomous cars, on Thursday issued a list of questions it wanted federal officials to require Google to answer.

“At the same time that Google wants to blow past federal safety requirements, the company has refused to provide detailed information that would enable the public, the press and policymakers to assess the safety and security of its autonomous cars,” Consumer Watchdog founder Harvey Rosenfield said in a statement.

Consumer Watchdog wants Google to publish a list of all the “real-life situations the cars cannot yet understand,” along with Google’s plan for addressing them. It also wants the company to explain what will happen if a robot driver goes offline in a car that is carrying a passenger but has no steering wheel or pedals.

The group also demanded that Google publish all video and technical data associated with accidents and other “anomalous situations” involving its driverless cars, and that it publish all its data on the vehicles’ safety, as well as its software algorithms for the robo-cars.

The double-barreled question No. 6 is probably the most loaded of the 10: “Do you expect one of your robot cars to be involved in a fatal crash? If your robot car causes the crash, how would you be held accountable?”

A Google spokesperson noted in a statement that more than 33,000 people die every year on U.S. roads. “We believe that self-driving cars can make a difference,” the spokesperson said. “Self-driving cars are still in a phase of rapid development but today’s rules were written with yesterday’s cars in mind.

“That’s why we want to work with policymakers to ensure that this technology can be deployed when they’re proven to provide an equal or higher level of safety than existing standards.”

Photo: A Google driverless car navigating along a street in Mountain View in 2014 (AP Photo/Google)






  • Robb49

    The biggest risk is that the self-driving car obeys the rules while human drivers often don’t. So, they aren’t going to drive faster than the speed limit when everybody else does, and they aren’t going to drive recklessly and aggressively in heavy traffic like other drivers do. Because the driverless cars will obey the letter of the law, it’s going to make bad drivers impatient. More likely than not, accidents will be the other driver’s fault.

    • cw

      Actually, if you watch the SXSW talk, you’ll find that the Google car does try to follow social norms, not the letter of the law. The specific example mentioned was crossing a double yellow line to pass a double-parked car, but I suspect this might apply elsewhere as well.

      • Robb49

        If a car does make judgment calls based on social norms instead of the letter of the law, that indeed raises issues about liability. People are cited and sued all the time for their misinterpretations or “bending” of the law. You can’t sue a car for doing that. Maybe the owner. But the owner could claim the car is semi-sentient and was acting outside of their control. So, I hope the car at least has manners or enough trunk space to hide a body.