Tesla’s Full Self-Driving Under Investigation by NHTSA

Fatal Pedestrian Crash and Fog Issues Lead to Federal Inquiry


The U.S. has opened a federal investigation into whether Tesla Inc.’s partial-automation system marketed as Full Self-Driving is defective after a series of crashes, one of which resulted in a fatality.

The National Highway Traffic Safety Administration said Oct. 18 it will assess whether Tesla’s system, also known as FSD, can detect and appropriately respond to fog and other reduced-visibility conditions. The agency said four crashes have been reported in such conditions while FSD was engaged.

In one of those crashes, the Tesla vehicle fatally struck a pedestrian, and another collision resulted in a reported injury, according to NHTSA. Tesla representatives didn’t respond to an emailed request for comment.



The probe marks a potentially major setback to CEO Elon Musk’s efforts to position Tesla as a leader in automated driving. The company staged an event at a Los Angeles-area movie studio just last week with autonomous vehicle concepts. It has for years charged consumers thousands of dollars for FSD, which requires constant driver supervision.


Tesla shares fell as much as 1% before the start of regular trading. The stock sold off after last week’s unveiling of robotaxi prototypes, an event that offered few details on how the company will realize its CEO’s self-driving ambitions.

The defect investigation comes on top of a recall query NHTSA opened in April into whether Tesla had done enough to keep drivers from misusing another set of assistance features marketed as Autopilot. The agency is looking into whether a software update Tesla deployed late last year ensures that drivers stay engaged while using the system.

NHTSA has said there’s been “a critical safety gap” between what drivers think Autopilot can do and its actual capabilities. That gap has led to foreseeable misuse of the system and avoidable crashes, according to the agency.


In April, NHTSA said it had found 211 incidents in which Teslas crashed on Autopilot, despite there being enough time for drivers to avoid or mitigate collisions. In 111 cases, drivers went off roadways after inadvertently disengaging the system.

Musk has said Tesla’s ability to develop autonomous-vehicle technology ultimately will determine whether the company is worth lots of money or “basically zero.” The automaker charges $8,000 for FSD.
