Federal authorities found that self-driving AI systems were involved in nearly 400 car accidents over the past year, with Teslas accounting for nearly 70 percent of them.
The New York Times reported that the National Highway Traffic Safety Administration on Wednesday shared the results of large-scale data collection and analysis to better understand the effectiveness and safety of self-driving systems on the road today.
According to the report, there were 392 accidents involving self-driving technologies between July 1 last year and May 15 this year. Six people were killed and five seriously injured in these accidents.
Of those 392 accidents, Tesla’s Autopilot, Full Self-Driving mode, or at least some component of its self-driving technology was active in 273.
Self-driving systems — whether a near-full autopilot mode that allows hands-free driving or simply a parking aid — are becoming more common in U.S. vehicles every year. Their growing presence on U.S. roads prompted regulators to review their safety record.
“These technologies show promise for improving safety, but we need to understand how these vehicles behave in real-world situations,” said Dr. Steven Cliff, the administrator of NHTSA. “This will help our investigators quickly identify potential error trends.”
Dr. Cliff spoke to reporters about the agency’s findings on Wednesday but cautioned against drawing too many conclusions from the numbers just yet. He noted that the data lacks context that would help interpret it, such as how many self-driving cars each manufacturer has in operation.
Ford, GM, and other manufacturers have launched new vehicle models with self-driving technology, but it’s unclear how many of these vehicles are actually on the road. Tesla has 830,000 Autopilot-enabled cars on the road, which may explain why it accounts for nearly 70 percent of reported accidents.
“The data may raise more questions than it answers,” said Dr. Cliff.
He said the agency will continue to collect data to better understand the technology and its potential dangers.
NHTSA was already reviewing Tesla’s self-driving technology before the results were released. The company was the focus of a “preliminary evaluation” by the agency, which analyzes accidents involving its cars and how their self-driving technology affected those incidents. After finding “patterns in system performance and associated driver behavior” that contributed to the accidents, the agency upgraded the preliminary evaluation to an “engineering analysis,” a step that often precedes a recall.
The agency noted that in some cases, Tesla Autopilot was disabled less than a second before the crash.
“The agency’s analysis of these sixteen affected first responder and road maintenance vehicle accidents found that in most incidents, Forward Collision Warnings (FCW) were activated immediately prior to impact and that subsequent Automatic Emergency Braking (AEB) intervened in about half of the collisions. On average, in these accidents, Autopilot aborted vehicle control less than a second before the initial impact,” the agency’s report said.