The National Highway Traffic Safety Administration said seven of the crashes resulted in 17 injuries and one death.
All of the Teslas in question had either Autopilot or Traffic-Aware Cruise Control engaged as they approached the crash scenes, the NHTSA said.
The crashes under investigation occurred between January 22, 2018 and July 10, 2021, in nine different states. Most took place at night, and the post-crash scenes all included scene control measures such as first responder vehicle lights, flares, an illuminated arrow board, and road cones.
Tesla did not immediately respond to a request for comment on the probe.
Tesla has sought to offer increasingly comprehensive autonomous driving technology to its drivers. But although the company says its data shows cars using Autopilot have fewer accidents per mile than cars controlled by human drivers, it warns that “current Autopilot features require active driver supervision and do not make the vehicle autonomous.”
The safety agency said its investigation would allow it to “gain a better understanding of the causes of certain Tesla crashes,” including “the technologies and methods used to monitor, assist, and enforce the driver’s engagement with the driving task during Autopilot use.” It will also examine the factors that contributed to the crashes.
“NHTSA reminds the public that no commercially available motor vehicle today is capable of driving itself,” the agency said in a statement. “Every available vehicle requires a human driver in control at all times, and all state laws hold human drivers responsible for the operation of their vehicles. Certain advanced driving assistance features can promote safety by helping drivers avoid crashes and mitigate the severity of crashes that occur, but as with all technologies and equipment on motor vehicles, drivers must use them correctly and responsibly.”
The investigation covers the Tesla Models Y, X, S and 3 from model years 2014 through 2021.
Gordon Johnson, an outspoken Tesla analyst and critic, wrote in a note to clients on Monday that the issue affects not only Autopilot users but also other, non-Tesla drivers on the road who could be injured by cars using the feature.
“NHTSA is focusing on a particular danger Tesla creates for people outside the vehicle, that is, those who have never agreed to be ‘guinea pigs’ on autopilot,” wrote Johnson. “So just saying ‘Tesla drivers accept the risks of autopilot’, as has been used in the past, doesn’t seem like a defense here.”
Driver assistance options such as Tesla’s Autopilot, or the more widely available adaptive cruise control offered on a broad range of vehicles from other automakers, allow a vehicle to slow down when the car ahead is slowing, said Sam Abuelsamid, an autonomous vehicle expert and senior analyst at Guidehouse Insights.
But Abuelsamid said these vehicles are designed to ignore stationary objects when traveling above 40 mph, so they don’t slam on the brakes when approaching overpasses or other fixed objects at the edge of the road, such as a car stopped on the shoulder. Most vehicles with some form of automatic braking do stop for stationary objects when moving at lower speeds, Abuelsamid said.
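The speed-gated behavior Abuelsamid describes can be illustrated with a short, simplified sketch. The example below is purely illustrative and does not reflect Tesla’s or any other automaker’s actual implementation; the 40 mph threshold, the function, and all variable names are assumptions drawn only from his description.

```python
# Illustrative sketch only: a simplified speed-gated braking policy of the kind
# described above. Not any automaker's real logic; the 40 mph threshold and all
# names here are assumptions based on the article's description.

STATIONARY_IGNORE_THRESHOLD_MPH = 40  # above this speed, stationary detections are ignored

def should_brake_for_object(ego_speed_mph: float, object_speed_mph: float) -> bool:
    """Return True if this simplified assistance system would brake for a detected object."""
    object_is_stationary = abs(object_speed_mph) < 1.0
    if object_is_stationary and ego_speed_mph > STATIONARY_IGNORE_THRESHOLD_MPH:
        # At highway speeds, stationary returns (overpasses, roadside signs,
        # a car parked on the shoulder) are filtered out to avoid phantom braking.
        return False
    return True

# A stopped emergency vehicle in the lane ahead would be ignored at 65 mph by this
# policy, but would trigger braking at 30 mph.
print(should_brake_for_object(65, 0))  # False
print(should_brake_for_object(30, 0))  # True
```

Under that kind of filtering, a stopped first responder vehicle at a crash scene looks no different to the system than a harmless roadside fixture, which is consistent with the scenario the NHTSA probe is examining.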
The real problem, he said, is that far more Tesla owners assume their cars can actually drive themselves than do drivers of other cars with automatic braking and other safety features. And the cues a driver sees when approaching a crash scene, such as hazard lights or flares, make more sense to a human than they might to a car’s automated driving system.
“When it works, which can be most of the time, it can be very good,” Abuelsamid said of Tesla’s Autopilot feature. “But it can easily be fooled by things that humans would have no problem with. Machine vision is not as adaptive as human vision. And the problem is that all machine systems sometimes make silly mistakes.”