US will investigate Tesla's Autopilot system over crashes with emergency vehicles

The investigation was prompted by at least 11 accidents in which Teslas using Autopilot, an assisted-driving system that can steer, accelerate and brake on its own, drove into parked firetrucks, police cars and other emergency vehicles

By Neal E. Boudette and Niraj Chokshi
Published: Aug 17, 2021

Image: Arnd Wiegmann / Reuters

The U.S. auto safety regulator said Monday that it had opened a broad investigation of the Autopilot system used in hundreds of thousands of Tesla’s electric cars.

The investigation was prompted by at least 11 accidents in which Teslas using Autopilot, an assisted-driving system that can steer, accelerate and brake on its own, drove into parked firetrucks, police cars and other emergency vehicles, the safety agency, the National Highway Traffic Safety Administration, disclosed. Those crashes killed one person and injured 17 others.

Safety experts and regulators have been scrutinizing Autopilot since the first fatal accident involving the system was reported in 2016, in which the driver of a Tesla Model S was killed when his car struck a tractor-trailer in Florida. In that case, the safety agency concluded there were no defects — a position it stuck to for years even as the number of crashes and deaths involving Autopilot climbed.

On Monday, the agency appeared to change course. The investigation is the broadest look yet at Autopilot and at potential flaws that could make it, and the Teslas that rely on it, dangerous.

Depending on its findings, the safety agency could force Tesla to recall cars and make changes to the system. It also has the authority to require automakers to add safety devices and features to their cars, as it did with rearview cameras and air bags.

One critical issue that investigators will focus on is how Autopilot ensures that Tesla drivers are paying attention to the road and are prepared to retake control of their cars in case the system fails to recognize and brake for something. The company’s owner’s manuals instruct drivers to keep their hands on the steering wheel, but the system continues operating even if drivers only occasionally tap the wheel.

“Driver monitoring has been a big deficiency in Autopilot,” said Raj Rajkumar, an engineering professor at Carnegie Mellon University who focuses on autonomous vehicles. “I think this investigation should have been initiated some time ago, but it’s better late than never.”

Tesla, the world’s most valuable automaker by far, and its charismatic and brash chief executive, Elon Musk, have said that Autopilot is not flawed, insisting that it makes cars much safer than others on the road. They have dismissed warnings from safety experts and the National Transportation Safety Board that have been critical of how the company has designed Autopilot.

The company and Musk, who comments frequently on Twitter, did not respond to messages seeking comment on Monday and issued no public statements about the new investigation.

Musk has previously been dismissive of the idea that Tesla’s advanced driver-assistance system ought to monitor drivers, and he said in 2019 that human intervention could make such systems less safe.

His views stand in stark contrast to the approach General Motors and other automakers have taken. GM, for example, offers a driver-assistance system known as Super Cruise on a few models. The system allows drivers to take their hands off the steering wheel but uses an infrared camera to monitor drivers’ eyes to ensure that they are looking at the road.

The safety agency said it would also examine how Autopilot identifies objects on the road and under what conditions Autopilot can be turned on. Tesla tells drivers to use the system only on divided highways, but it can be used on smaller roads and streets. GM uses GPS to restrict Super Cruise’s use to major highways that do not have oncoming or cross traffic, intersections, pedestrians and cyclists.

Tesla’s Autopilot system appears to have difficulty detecting and braking for parked cars generally, including private cars and trucks without flashing lights. In July, for example, a Tesla crashed into a parked sport utility vehicle. The driver had Autopilot on, had fallen asleep and later failed a sobriety test, the California Highway Patrol said.

The safety agency’s investigation will look at all models of Teslas — Y, X, S and 3 — from the 2014 to 2021 model years, totaling 765,000 cars, a large majority of the vehicles the company has made in the United States.

The new investigation comes on top of reviews the safety agency is conducting of more than two dozen crashes involving Autopilot. The agency has said eight of those crashes resulted in a total of 10 deaths. Those investigations are meant to delve into the details of individual cases to provide data and insights that the agency and automakers can use to improve safety or identify problem areas.

Tesla has acknowledged that Autopilot can sometimes fail to recognize stopped emergency vehicles. And safety experts, videos posted on social media and Tesla drivers themselves have documented a variety of weaknesses of Autopilot.

In some accidents involving the system, drivers of Teslas have been found asleep at the wheel or were awake but distracted or disengaged. A California man was arrested in May after leaving the driver’s seat of his Tesla while it was on Autopilot; he was sitting in the back of his car as it crossed the San Francisco-Oakland Bay Bridge.

The National Transportation Safety Board, which investigates accidents but cannot force automakers to make changes, has called on the National Highway Traffic Safety Administration to take stronger action on regulating Autopilot and other advanced driver-assistance systems. Last year, the transportation safety board said in a report that Tesla’s “ineffective monitoring of driver engagement” contributed to a 2018 crash that killed Wei Huang, the driver of a Model X that hit a highway barrier in Mountain View, California.

After that report came out, the transportation safety board’s chairman at the time, Robert L. Sumwalt, called on the highway safety administration “to fulfill its oversight responsibility to ensure that corrective action is taken.”

“It’s time to stop enabling drivers in any partially automated vehicle to pretend that they have driverless cars,” he said.

In a statement Monday, the transportation safety board said NHTSA’s investigation of Autopilot was “a positive step forward for safety.”

©2021 New York Times News Service
