Nearly 400 accidents in the U.S. in 10 months involved cars using advanced driver assistance technologies, the federal government’s top car safety regulator revealed on Wednesday.
The findings are part of a comprehensive effort by the National Highway Traffic Safety Administration to determine the safety of advanced driving systems as they become more commonplace.
In 392 incidents reported by the agency from July 1 last year to May 15, six people died and five were seriously injured. Teslas running Autopilot, the company’s most ambitious driver assistance mode, or any of its associated component features were involved in 273 crashes. Five of those Tesla crashes were fatal.
The data was collected under an NHTSA order issued last year requiring carmakers to report accidents involving cars using advanced driver assistance systems. Carmakers have introduced these systems in recent years, including features that let drivers take their hands off the steering wheel under certain conditions and that help them parallel park.
The NHTSA’s order was an unusually bold move for the regulator, which has been criticized in recent years for not being more assertive with carmakers.
“Until last year, NHTSA’s response to autonomous vehicles and driver assistance has been frankly passive,” said Matthew Wansley, a professor at Cardozo School of Law in New York who specializes in emerging automotive technologies. “This is the first time the federal government has directly collected accident data on these technologies.”
Speaking to reporters ahead of Wednesday’s publication, NHTSA Administrator Steven Cliff said the data, which the agency will continue to collect, “will help our researchers quickly identify potential defect trends that emerge.”
Dr. Cliff said NHTSA would use the data as a guide in creating any rules or requirements for the design and use of these technologies. “These technologies promise to improve safety, but we need to understand how these vehicles work in real-world situations,” he said.
But he warned not to draw conclusions from the data collected so far, noting that it does not take into account factors such as the number of cars from each manufacturer that are on the road and equipped with such technologies.
An advanced driver assistance system can steer, brake and accelerate vehicles on its own, although drivers must be alert and prepared to take control of the vehicle at any time.
Safety experts are concerned that these systems allow drivers to give up active control of the car and may lead them to believe their cars are driving themselves. When the technology malfunctions or cannot handle a particular situation, drivers may not be ready to take control quickly.
Some 830,000 Tesla cars in the United States are equipped with Autopilot or the company’s other driver assistance technologies, which may help explain why Tesla vehicles accounted for nearly 70 percent of the crashes reported in the data released Wednesday.
Ford Motor, General Motors, BMW and others have similar advanced systems that allow hands-free driving in certain road conditions, but far fewer of these models have been sold. These companies, however, have sold millions of cars over the past two decades that are equipped with individual components of driver assistance systems. Components include so-called lane keeping, which helps drivers stay in their lanes, and adaptive cruise control, which adjusts a car’s speed and brakes automatically when traffic ahead slows.
In a statement on Wednesday, NHTSA revealed that Honda vehicles were involved in 90 incidents and Subarus in 10. Ford, GM, BMW, Volkswagen, Toyota, Hyundai and Porsche reported five or fewer each.
The data includes cars with systems designed to operate with little or no driver intervention, as well as separate data on systems that can simultaneously steer and control the car’s speed but require constant driver attention.
Automated vehicles, most of which are still in development but are being tested on public roads, were involved in 130 incidents, NHTSA found. One person was seriously injured, 15 suffered minor or moderate injuries, and 108 incidents involved no injuries. Many of the collisions involving automated vehicles were fender benders or bumper taps, because the vehicles operate mainly at low speeds and in city driving.
In more than a third of the 130 accidents involving automated systems, the car was stopped and was hit by another vehicle. In 11 crashes, a car equipped with the technology was proceeding straight and collided with another vehicle that was changing lanes, the data showed.
Most of the incidents involving automated systems occurred in San Francisco or the Bay Area, where companies such as Waymo, Argo AI and Cruise are testing and refining the technology.
Waymo, which is owned by Google’s parent company and operates a fleet of driverless taxis in Arizona, was involved in 62 incidents. Cruise, a division of GM, was involved in 23. Cruise recently began offering driverless taxi rides in San Francisco and this month received permission from California authorities to start charging passengers.
None of the cars using automated systems were involved in fatal crashes, and only one crash caused a serious injury. In March, a cyclist struck a Cruise-operated vehicle from behind while both were traveling down a San Francisco street.
NHTSA’s order for carmakers to send the data was driven in part by accidents and fatalities over the past six years involving Teslas operating on Autopilot. Last week, NHTSA expanded its investigation into whether Autopilot has technological and design defects that pose safety risks.
The agency has been investigating 35 accidents that occurred while Autopilot was activated, including nine that have resulted in 14 deaths since 2014. It had also opened a preliminary investigation into 16 incidents in which Teslas under Autopilot control crashed into emergency vehicles that were stopped with their lights flashing.
In November, Tesla recalled nearly 12,000 vehicles that were part of the beta test of Full Self Driving, a version of Autopilot designed for use on city streets, after deploying a software update that, according to the company, could cause accidents by unexpectedly activating the cars’ emergency braking system.
The NHTSA order required companies to report accidents in which advanced driver assistance systems and automated technologies were in use within 30 seconds of impact. While the data provides a broader picture of the behavior of these systems than ever before, it is still difficult to determine whether they reduce accidents or improve safety.
The agency has not collected data that would allow researchers to easily determine whether using these systems is safer than turning them off in the same situations. Carmakers were allowed to redact descriptions of what happened during the accidents, an option that Tesla, as well as Ford and others, used regularly, making the data harder to interpret.
Some independent studies have explored these technologies, but have not yet shown whether they reduce accidents or improve safety.
J. Christian Gerdes, a professor of mechanical engineering and director of the Center for Automotive Research at Stanford University, said the data released Wednesday was useful, to some extent. “Can we learn more from this data? Yes,” he said. “Is it an absolute gold mine for researchers? I don’t see that.”
Because of how the data was reported, he said, it was difficult to assess its ultimate usefulness. “NHTSA understands this data much better than the general public can just by looking at what was published,” he said.
Dr. Cliff, the NHTSA administrator, was cautious about drawing conclusions from the results. “The data may raise more questions than they answer,” he said.
But some experts said the new information available should encourage regulators to be more assertive.
“The NHTSA can and should use its various powers to do more: regulation, star ratings, research, additional queries, and soft influence,” said Bryant Walker Smith, an associate professor in the engineering and law schools at the University of South Carolina who specializes in emerging transportation technologies.
“These data could also lead to more voluntary and involuntary disclosures,” he added. “Some companies may gladly provide more context, especially on mileage, ‘avoided’ accidents and other indicators of good performance. Lawyers will look for patterns and even cases in this data.”
However, he said, “this is a good start.”
Jason Kao, Asmaa Elkeurti and Vivian Li provided research and reports.