Many of the drivers who died in Chevrolet Cobalts and other General Motors cars with faulty ignition switches were speeding, drunk or otherwise reckless. Some were all of the above.
For years, crash investigators and even the victims' own families blamed their deaths on those mistakes. It wasn't until 2014, when GM admitted that the cars were defective, that the truth became clear.
Even then, the automaker had plenty of defenders who insisted that the matter was overblown. It's not GM's fault, they argued, if someone dies because they careened through a cul-de-sac and into a tree at 70 mph.
To be sure, bad judgment and human error played a role in a good number of the 124 deaths officially linked to the defective ignition switches. But that doesn't excuse GM, which failed its customers first by building vehicles that didn't work properly and then by not recalling them for years. For nearly a decade, GM even helped keep the defect hidden by confidentially settling lawsuits filed by victims' relatives.
Now federal safety regulators are finally asking tough questions of Tesla, which has long been under fire for its Autopilot system and its new Full Self-Driving software.
The National Highway Traffic Safety Administration is investigating the electric-vehicle maker over 12 crashes at emergency-response scenes involving Teslas operating on Autopilot. YouTube is also littered with videos of Autopilot-enabled Teslas having near misses with pedestrians, barricades and other obstacles that wouldn't fool a human driver.
Tesla's cult of staunch defenders has reflexively pointed fingers anywhere but at the company or its chief executive, Elon Musk. The drivers weren't paying attention or were using the system improperly, Musk's congregants cry. They were drunk or nodding off behind the wheel. The critics just want Tesla to fail. Anything to deflect scrutiny from Tesla itself.
The excuses feel awfully familiar.
So do the nondisclosure agreements that Tesla has been requiring of customers who want early access to the Full Self-Driving software, clearly to prevent users from publicizing problems they experience.
To be sure, Tesla customers bear responsibility for improper use of their vehicles. And someday, fully autonomous vehicles may all but eliminate driver error as a cause of crashes.
But that day is not here yet.
And until it arrives, Tesla customers — whether they're good drivers or not — need to be able to count on the company and on Musk to act responsibly. The rest of us — everyone who shares the same roads and didn't agree to be a software beta tester during our daily errands and commutes — must be able to count on that, too.