By TOM KRISHER – AP Auto Writer
DETROIT (AP) — A U.S. investigation into Teslas running on partially automated driving systems that crashed into parked emergency vehicles has moved closer to a recall.
The National Highway Traffic Safety Administration said Thursday it was upgrading the probe to an engineering analysis, another sign of increased scrutiny of the electric vehicle maker and of automated systems that perform at least some driving tasks.
An engineering analysis is the final stage of an investigation, and in most cases NHTSA decides within a year whether there should be a recall or the probe should be closed.
Documents released Thursday by the agency raise serious concerns about Tesla’s Autopilot system. The agency has found that it is used in areas where its capabilities are limited and that many drivers take no action to avoid accidents despite warnings from the vehicle.
The agency said it has reports of 16 crashes into stopped emergency vehicles and trucks with warning signs, causing 15 injuries and one death.
The probe now covers 830,000 vehicles, nearly everything the Austin, Texas, automaker has sold in the United States since the start of the 2014 model year.
Investigators will evaluate additional data and vehicle performance, and will “explore the degree to which Autopilot and associated Tesla systems may exacerbate human factors or behavioral safety risks, undermining the effectiveness of driver supervision,” the agency said.
In the majority of the 16 crashes, the Teslas issued forward collision alerts to drivers just before impact. Automatic emergency braking intervened to at least slow the cars in about half of the cases. On average, Autopilot gave up control of the Teslas less than a second before the crash, according to NHTSA documents.
In documents detailing the engineering analysis, NHTSA wrote that it is also looking at crashes involving similar patterns that did not include emergency vehicles or trucks with warning signs.
The agency found that in many cases drivers had their hands on the wheel yet still failed to take action to avoid a crash. “This suggests that drivers may be compliant with the driver engagement strategy as designed,” the agency wrote.
Investigators also wrote that a driver’s use or misuse of vehicle components, “or operation of a vehicle in an unintended manner does not necessarily preclude a system defect.”
The agency will have to determine whether a safety defect exists before it can seek a recall.
In total, the agency reviewed 191 crashes but removed 85 from consideration because other drivers were involved or there was not enough information to make a definitive assessment. In the remaining 106, the primary cause appears to be operating Autopilot in areas where it has limitations, or in conditions that can interfere with its operation: “For example, operation on roadways other than limited access highways, or operation while in low traction or visibility environments such as rain, snow or ice.”
NHTSA began its investigation last August after a series of crashes since 2018 in which Teslas using the company’s Autopilot or Traffic Aware Cruise Control systems hit vehicles at scenes where first responders used flashing lights, flares, an illuminated arrow board, or cones warning of hazards.
Copyright 2022 The Associated Press. All rights reserved. This material may not be published, broadcast, rewritten or redistributed without permission.