Tesla's Autopilot technology faces fresh scrutiny

By Neal E. Boudette
Published: Mar 24, 2021

A file photo from the time of Tesla's 2016 overhaul of the car's operating system, known as Tesla 8.0, whose biggest change was shifting Autopilot toward a heavier reliance on radar than on cameras to guide the car through traffic.
Image: Christopher Goodney/Bloomberg via Getty Images

Tesla faced numerous questions about its Autopilot technology after a Florida driver was killed in 2016 when the system of sensors and cameras failed to see and brake for a tractor-trailer crossing a road.

Now the company is facing more scrutiny than it has in the past five years for Autopilot, which Tesla and its chief executive, Elon Musk, have long maintained makes its cars safer than other vehicles. Federal officials are looking into a series of recent accidents involving Teslas that either were using Autopilot or might have been using it.

The National Highway Traffic Safety Administration confirmed last week that it was investigating 23 such crashes. In one accident this month, a Tesla Model Y rear-ended a police car that had stopped on a highway near Lansing, Michigan. The driver, who was not seriously injured, had been using Autopilot, the police said.

In February in Detroit, under circumstances similar to the 2016 Florida accident, a Tesla drove beneath a tractor-trailer that was crossing the road, tearing the roof off the car. The driver and a passenger were seriously injured. Officials have not said whether the driver had turned on Autopilot.

NHTSA is also looking into a Feb. 27 crash near Houston in which a Tesla ran into a stopped police vehicle on a highway. It is not clear if the driver was using Autopilot. The car did not appear to slow before the impact, the police said.

Autopilot is a computerized system that uses radar and cameras to detect lane markings, other vehicles and objects in the road. It can steer, brake and accelerate automatically with little input from the driver. Tesla has said it should be used only on divided highways, but videos on social media show drivers using Autopilot on various kinds of roads.

“We need to see the results of the investigations first, but these incidents are the latest examples that show these advanced cruise-control features Tesla has are not very good at detecting and then stopping for a vehicle that is stopped in a highway circumstance,” said Jason Levine, executive director of the Center for Auto Safety, a group created in the 1970s by the Consumers Union and Ralph Nader.

This renewed scrutiny arrives at a critical time for Tesla. After reaching a record high this year, its share price has fallen about 20% amid signs that the company’s electric cars are losing market share to traditional automakers. Ford Motor’s Mustang Mach-E and the Volkswagen ID.4 recently arrived in showrooms and are considered serious challengers to the Model Y.

The outcome of the current investigations is important not only for Tesla but for other technology and auto companies that are working on autonomous cars. While Musk has frequently suggested the widespread use of these vehicles is near, Ford, General Motors and Waymo, a division of Google’s parent, Alphabet, have said that moment could be years or even decades away.

Bryant Walker Smith, a professor at the University of South Carolina who has advised the federal government on automated driving, said it was important to develop advanced technologies to reduce traffic fatalities, which now number about 40,000 a year. But he said he had concerns about Autopilot, and how the name and Tesla’s marketing imply drivers can safely turn their attention away from the road.

“There is an incredible disconnect between what the company and its founder are saying and letting people believe, and what their system is actually capable of,” he said.

Tesla, which disbanded its public relations department and generally does not respond to inquiries from reporters, did not return phone calls or emails seeking comment. And Musk did not respond to questions sent to him on Twitter.

The company has not publicly addressed the recent crashes. Although it can determine whether Autopilot was on at the time of an accident because its cars constantly send data to the company, it has not said if the system was in use.

The company has argued that its cars are very safe, claiming that its own data shows that Teslas are in fewer accidents per mile driven and even fewer when Autopilot is in use. It has also said it tells drivers that they must pay close attention to the road when using Autopilot and should always be ready to retake control of their cars.

A federal investigation of the 2016 fatal crash in Florida found that Autopilot had failed to recognize a white semi trailer against a bright sky, and that the driver was able to use it when he wasn’t on a highway. Autopilot continued operating the car at 74 mph even as the driver, Joshua Brown, ignored several warnings to keep his hands on the steering wheel.

A second fatal incident took place in Florida in 2019 under similar circumstances — a Tesla crashed into a tractor-trailer when Autopilot was engaged. Investigators determined that the driver had not had his hands on the steering wheel before impact.

While NHTSA has not forced Tesla to recall Autopilot, the National Transportation Safety Board concluded that the system “played a major role” in the 2016 Florida accident. It also said the technology lacked safeguards to prevent drivers from taking their hands off the steering wheel or looking away from the road. The safety board reached similar conclusions when it investigated a 2018 accident in California.

By comparison, a similar system from General Motors, Super Cruise, monitors a driver’s eyes and switches off if the person looks away from the road for more than a few seconds. That system can be used only on major highways.

In a Feb. 1 letter, the chairman of the NTSB, Robert Sumwalt, criticized NHTSA for not doing more to evaluate Autopilot and require Tesla to add safeguards that prevent drivers from misusing the system.

The new administration in Washington could take a firmer line on safety. The Trump administration did not seek to impose many regulations on autonomous vehicles and sought to ease other rules the auto industry did not like, including fuel-economy standards. By contrast, President Joe Biden has appointed an acting NHTSA administrator, Steven Cliff, who worked at the California Air Resources Board, which frequently clashed with the Trump administration on regulations.

Concerns about Autopilot could dissuade some car buyers from paying Tesla for a more advanced version, Full Self-Driving, which the company sells for $10,000. Many customers have paid for it in the expectation of being able to use it in the future; Tesla made the option operational on about 2,000 cars in a “beta” or test version starting late last year, and Musk recently said the company would soon make it available to more cars. Full Self-Driving is supposed to be able to operate Tesla cars in cities and on local roads where driving conditions are made more complex by oncoming traffic, intersections, traffic lights, pedestrians and cyclists.

Despite their names, Autopilot and Full Self-Driving have big limitations. Their software and sensors cannot control cars in many situations, which is why drivers have to keep their eyes on the road and hands on or close to the wheel.

In a November letter to California’s Department of Motor Vehicles that recently became public, a Tesla lawyer acknowledged that Full Self-Driving struggled to react to a wide range of driving situations and should not be considered a fully autonomous driving system.

The system is “not capable of recognizing or responding” to certain “circumstances and events,” Eric C. Williams, Tesla’s associate general counsel, wrote. “These include static objects and road debris, emergency vehicles, construction zones, large uncontrolled intersections with multiple incoming ways, occlusions, adverse weather, complicated or adversarial vehicles in the driving paths, unmapped roads.”

Levine of the Center for Auto Safety has complained to federal regulators that the names Autopilot and Full Self-Driving are misleading at best and could be encouraging some drivers to be reckless.

“Autopilot suggests the car can drive itself and, more importantly, stop itself,” he said. “And they doubled down with Full Self-Driving, and again that leads consumers to believe the vehicle is capable of doing things it is not capable of doing.”

©2021 New York Times News Service
