US Auto Safety Probe Targets Tesla’s Full Self-Driving System
The U.S. National Highway Traffic Safety Administration (NHTSA) has launched an investigation into nearly 2.9 million Tesla vehicles equipped with Full Self-Driving (FSD) technology. The probe follows more than 50 reports alleging that Tesla cars ran red lights, made unsafe lane changes into oncoming traffic, and otherwise violated traffic safety laws, resulting in multiple crashes and injuries.
Scope of Investigation and Safety Concerns
NHTSA identified 58 incidents linked to FSD, including 14 crashes that caused 23 injuries. Six reports detailed cases in which Tesla vehicles approached intersections at red lights, continued through against the signal, and collided with other vehicles. A recurring pattern of violations was noted at a single intersection in Joppa, Maryland, prompting Tesla to issue a software update to address it. The agency is examining whether Tesla adequately warned drivers of FSD's behavior and whether drivers had sufficient time to intervene.
Background on Tesla’s Safety Scrutiny
This is not the first time Tesla's driver-assistance systems have drawn regulatory scrutiny. In October 2024, NHTSA opened a probe into 2.4 million Tesla vehicles after crashes in poor-visibility conditions, including a fatal accident in Arizona. Tesla recalled about 2 million vehicles in December 2023 over Autopilot safety concerns and more than 360,000 vehicles in early 2023 over issues with its FSD Beta software.
Potential Recall and Regulatory Impact
NHTSA's current preliminary evaluation could lead to a recall if the agency determines that Tesla's system poses an unreasonable safety risk. Tesla recently released a software update for FSD but has not publicly commented on the investigation. The probe comes amid growing congressional scrutiny of Tesla's advanced driver-assistance technology and coincides with the arrival of a new NHTSA administrator.
Tesla’s Position and Technology Limits
Tesla markets FSD as a driver-assistance system that requires supervision and intervention, while stating it can navigate "nearly anywhere" with minimal driver input. Drivers and regulators, however, point to critical shortcomings, such as a failure to recognize traffic signals properly, which can lead to risky maneuvers like running red lights and crossing into oncoming lanes. These gaps have led to lawsuits, regulatory scrutiny, and calls for stronger safety measures.
The investigation underscores the ongoing tension between advancing vehicle automation and ensuring public safety. While Tesla pushes toward greater autonomy, regulators remain vigilant, emphasizing that driver-assistance systems must operate reliably to prevent crashes and injuries.
