The death toll from accidents involving the failure of Tesla’s much-maligned Autopilot semi-autonomous driving technology has reached 17, according to new data from the National Highway Traffic Safety Administration.
The Washington Post said it gleaned the figures from reports compiled by NHTSA. Despite the rising number of fatalities, Tesla CEO Elon Musk continues to defend two technologies, Autopilot and Full Self-Driving, routinely prodding Tesla owners to use them.
“There’ll be a little bit of two steps forward, one step back between releases for those trying the beta. But the trend is very clearly towards full self-driving, towards full autonomy. And I hesitate to say this, but I think we’ll do it this year. So that’s what it looks like. Yes,” he said during a conference call with analysts and investors back in April when asked about the status of FSD, the more advanced of the two systems.
Autopilot under scrutiny
Autopilot is at the center of an ongoing federal safety investigation. The Post reported over the weekend there have been 736 crashes and 17 fatalities in the U.S. since 2019 involving Teslas in Autopilot mode, far more than previously reported.
The figures come from the Post’s analysis of NHTSA data, which also showed Teslas were involved in the vast majority of the more than 800 accidents tallied in the report.
The number of such crashes has surged in the past four years, the data show, reflecting both the hazards associated with increasing use of Tesla’s driver-assistance technology and the growing presence of Teslas on the nation’s highways as sales of the company’s electric vehicles have steadily increased, according to the Post.
“When authorities first released a partial accounting of accidents involving Autopilot in June 2022, they counted only three deaths definitively linked to the technology. The most recent data includes at least 17 fatal incidents, 11 of them since May 2022, and five serious injuries,” the Post reported.
Tesla, which has no media relations or public relations department, had no comment on the Post report.
Tesla ducks responsibility
While the Tesla website carries a disclaimer stating, “Current Autopilot features require active driver supervision and do not make the vehicle autonomous,” the company’s branding has been accused of misleading drivers about their vehicles’ capabilities.
Tesla also recently prevailed in a lawsuit in which the plaintiff tried to blame the company’s Autopilot program for a 2019 crash.
The jurors in the California case found the software wasn’t at fault in a crash in which the car turned into a median on a city street while Autopilot was engaged. They essentially upheld the legal precedent, developed during the past century of motoring, that the human driver is responsible for the operation of the vehicle.
Critics argue that by choosing names like Autopilot and Full Self-Driving, Musk and Tesla give drivers a false sense of security. Other automakers that now offer similar technology, such as General Motors, Ford and Mercedes-Benz, are careful to avoid hyping the safety benefits of their driver-assistance features, which allow the driver to remove their hands from the wheel under certain circumstances.
Musk has insisted cars using FSD are safer than those piloted solely by human drivers, citing crash-rate comparisons between the two modes of driving, a claim made by no other automaker.
But the Post found four of the fatal accidents involved a motorcycle, while another involved an emergency vehicle, which, in theory, the system has been taught to avoid.
Musk also has repeatedly defended his decision to push driver-assistance technologies to Tesla owners, arguing that the benefit outweighs the harm.
“At the point of which you believe that adding autonomy reduces injury and death, I think you have a moral obligation to deploy it even though you’re going to get sued and blamed by a lot of people,” Musk said last year. “Because the people whose lives you saved don’t know that their lives were saved. And the people who do occasionally die or get injured, they know.”
Last year, though, the “Dawn Project” bought a full-page ad in the New York Times that described “Full Self-Driving” software as “the worst software ever sold by a Fortune 500 company.”