NTSB Chair Wants Tesla to Limit Where Autopilot Can Operate

Autopilot has frequently been misused by Tesla drivers.

This July 8, 2018 photo shows 2018 Tesla Model 3 sedans on display outside a Tesla showroom in Littleton, Colo. Tesla wants to keep secret its response to the U.S. government's request for information in an investigation of its Autopilot partially automated driving system.
AP Photo/David Zalubowski, File

DETROIT (AP) — The head of the U.S. National Transportation Safety Board is calling on Tesla to act on recommendations to limit where its Autopilot driver-assist system can operate and to put a system in place to make sure drivers are paying attention.

In a letter sent to Tesla CEO Elon Musk on Monday, Chairwoman Jennifer Homendy says the electric vehicle maker has not responded to the agency's recommendations issued four years ago.

Homendy also says company statements that safety is the primary design requirement for Tesla are undercut by the rollout of “Full Self-Driving” software to customers who test it on public roads. The tests are being done “without first addressing the very design shortcomings” that allowed three fatal Tesla crashes that were investigated by the NTSB, she wrote.

The NTSB investigates crashes but has no regulatory authority. It can only make recommendations to automakers or other federal agencies such as the National Highway Traffic Safety Administration.

Messages were left Monday seeking comment from Tesla.

“If you are serious about putting safety front and center in Tesla vehicle design, I invite you to complete action on the safety recommendations we issued to you four years ago,” Homendy wrote.

The agency, she wrote, has long advocated for multiple technologies to prevent crashes and save lives, “but it’s crucial that such technology is implemented with the safety of all road users foremost in mind.”

Homendy wrote that her agency appreciates Tesla's cooperation as it investigates other fatal Tesla crashes in Texas and Florida.

She pointed out that the agency found that the driver in a 2016 crash in Williston, Florida, was using Autopilot on roads where the system wasn't designed to operate safely. The NTSB also determined that Autopilot didn't effectively monitor the driver to make sure he was paying attention.

Tesla has said that Autopilot and “Full Self-Driving” are driver-assist systems and cannot drive themselves, despite their names. It says drivers should always pay attention and be ready to take action.

The NTSB made the recommendations in 2017 to Tesla and five other automakers. The other five responded describing what action they would take, but Tesla did not officially respond, Homendy wrote.

The letter comes as federal agencies step up pressure on Tesla over its partially automated driving systems, and it arrived just hours after NHTSA posted a document showing that Tesla wants to keep secret its response to the agency's investigation of Autopilot.

The electric vehicle maker sent the agency a partial response by a Friday deadline. The agency is investigating how Autopilot detects and responds to emergency vehicles parked on highways.

In a document posted on its website Monday, the agency says it is reviewing Tesla's response, and that Tesla has asked that its whole submission be treated as confidential business information.

Companies often ask that some information be kept confidential when they respond to the agency, but NHTSA seldom allows entire documents to be kept secret. Much of the time the documents are heavily redacted before being placed in public files.

In August the safety agency made a detailed information request to Tesla in an 11-page letter that is part of a wide-ranging investigation into how Autopilot behaves when first responder vehicles are parked while crews deal with crashes or other hazards.

The agency wants to know how Teslas detect a crash scene, including flashing lights, road flares, reflective vests worn by responders and vehicles parked on the road.

The agency opened the investigation in August, citing 12 crashes in which Teslas on Autopilot hit parked police and fire vehicles. In the crashes under investigation, at least 17 people were hurt and one was killed.

NHTSA announced the investigation into Tesla's driver-assist systems, including Autopilot and Traffic Aware Cruise Control, after a series of collisions with emergency vehicles dating to 2018. The probe covers 765,000 vehicles from the 2014 through 2021 model years.

Autopilot, which can keep vehicles in their lanes and stop for obstacles in front of them, has frequently been misused by Tesla drivers.

The agency also is asking Tesla for details on how it ensures that drivers are paying attention, including instrument-panel displays and audible warnings.

Tesla also faces another deadline from NHTSA. By Nov. 1 it has to explain why an over-the-internet software update improving Autopilot's ability to spot emergency vehicles in low-light conditions should not be considered a recall.

Also, an NHTSA spokeswoman said Monday that the agency has asked Tesla for information about changes to “Full Self-Driving” software that is being tested on public roads by selected Tesla owners.

Musk wrote on Twitter during the weekend that Tesla was spotting “issues” with a new version of the software, so it was rolling the software back to a previous version. Earlier he wrote that the new version was experiencing “regression” in left turns at traffic lights.

On Monday, Musk tweeted that the problem had been fixed and said the issue was power-saving mode interacting with the software.

Critics say the changes show Tesla is testing software on public roads without proper simulation and internal checks.
