Safety groups say Congress has given self-driving companies too much leeway


Safety groups, while supportive of autonomous vehicles and their benefits in principle, say Congress has deferred too much to the auto industry in establishing a legal and regulatory framework for self-driving vehicles.

Developers want the certainty and consistency they say will allow greater investment and faster innovation. With fierce global competition to capture the lead in the burgeoning market for driverless vehicles, Congress appears willing to deliver conditions favorable to U.S.-based businesses.

"There's an implicit assumption that autonomous vehicles will be safe, and in the long run, especially if the right protections are in place, I think they will be a lot safer for human drivers," David Friedman, director of cars and policy analysis at Consumers Union, the advocacy division of Consumer Reports, told Automotive News. "But that's the endpoint. The question is, what is going to happen in the next five to 10 years?"

Bills before both chambers of Congress would allow companies to test tens of thousands of self-driving vehicles -- which do not comply with existing safety standards -- on public roads. Automakers could disable existing safety equipment without having to prove that the software and technology designed to perform the same function works. They would have to submit only voluntary safety reports rather than meet binding reporting requirements. As with the DOT guidelines, the bills would pre-empt states from setting autonomous vehicle design, construction and performance standards during testing.

AV START

The House passed a bill last fall, but the AV START Act has been stymied in the Senate by five Democrats who have taken advantage of procedural rules to block a vote unless their concerns about safety, cybersecurity and data privacy are addressed. In the wake of the Uber tragedy, public interest groups are demanding the Senate delay any vote until investigations are completed and the National Transportation Safety Board makes recommendations.

For safety advocates, the problem is threefold: no explicit testing requirements for driverless cars, exemptions from key safety standards, and no NHTSA requirement that developers submit safety data.

The light regulatory touch is risky when the technology is not yet perfected and interaction between self-driving cars and humans is growing, Friedman said.

It's reasonable to adapt regulations designed for human-controlled vehicles for vehicles without brake pedals and steering wheels, "but the solution is to work with NHTSA to establish a new testing protocol. Instead it seems like several technology and auto companies are in such a rush to get these vehicles out on the road that they are not putting safety first," he said in an interview two weeks before the Uber accident. "They are putting first, first. And that is a recipe for somebody getting hurt."

Friedman, a former acting NHTSA administrator, said autonomous vehicle developers need to understand that people are very wary of placing their safety in the hands of robot cars.

"I really want self-driving cars to succeed. Their potential to save lives, help people get around who never got around before is amazing, but if a company or two messes up, causes a significant crash, or multiple smaller crashes, and the public doesn't gain trust, it could be delayed for years or decades. I don't think we can afford that. That's why you've got to get it right first," he said.

Aviation risk

Auto companies need to take a page from the aviation industry, which realizes any crash is bad for business, Friedman said.

The last fatal airline crash of a commercial passenger jet in the U.S. was nine years ago.

"When you're a consumer and you're not in control, your risk tolerance is much, much lower," Friedman said. "So, while we as a nation sadly tolerated 37,461 deaths in 2016, once the computer is driving the car, the corporation is driving the car, I don't expect consumers would tolerate that, or even half or a quarter of that. I don't know what the right number is. Maybe it's a 10th. Our society's tolerance for fatalities when we are not the ones in the driver's seat is really low."

The former safety regulator said Waymo and General Motors are taking a more responsible, cautious approach to testing than companies such as Tesla, which experienced a fatal accident involving a Model S owner who was using Autopilot mode and didn't pay attention to the road ahead. Despite multiple crashes, Tesla CEO Elon Musk's philosophy is to roll out autonomous features when the company decides they are ready for everyday use.

Friedman said Waymo, formerly Google's self-driving car project, and GM are conducting more controlled, low-risk experiments. GM, for example, monitors its test drivers to make sure they are engaged and allows vehicles to operate only where it is safe.

But getting full transparency into the level of safety is still difficult, he added, because the companies publicly share only high-level aggregate data on miles driven and crashes. Regulators and outside experts don't have access to near-miss counts, crash details and other data.

"The driver always needs to be paying attention, and as a company you should not let that vehicle be driven outside of where it's safe," Friedman said. "If the sensors are not good enough to tell where that vehicle is going is safe or not, then you have to question the overall safety of their approach."

You can reach Eric Kulisch at ekulisch@crain.com
