Tesla’s “Self-Driving” Dream Hits Another Red Light
Tesla’s bold vision for cars that drive themselves has once again come under federal scrutiny.
The U.S. National Highway Traffic Safety Administration (NHTSA) has opened a new investigation into Tesla’s Full Self-Driving (FSD) software after receiving more than 50 reports of vehicles allegedly running red lights or drifting into wrong lanes, four of which reportedly led to injuries.
The move marks one of the first investigations aimed directly at Tesla’s most advanced driver-assistance system, which CEO Elon Musk has promoted as the company’s gateway to fully autonomous robotaxis.
What triggered the investigation
According to the agency’s Office of Defects Investigation (ODI), some drivers claimed their Teslas with FSD failed to stop at red lights or turned into opposing lanes, sometimes at the same intersection in Maryland.
NHTSA confirmed that Tesla has already “taken action” to fix that specific location issue, but the broader probe continues.
The investigation began shortly after Tesla rolled out the latest FSD update, which includes training data collected from its robotaxi pilot in Austin, Texas.
Tesla’s safety record so far
This isn’t Tesla’s first brush with regulators. Earlier this year, NHTSA closed an investigation into Tesla’s Autopilot system after linking it to 13 fatal crashes involving driver misuse.
However, the agency is still reviewing whether Tesla’s software fixes are working as intended.
Now, with FSD under a fresh spotlight, the company faces a tougher test: whether its autonomous tech can meet real-world safety standards.
The bigger question: Is AI ready to drive?
Tesla markets FSD as a feature that can handle city streets, intersections, and traffic lights with minimal human input.
But safety experts argue that the system still requires constant driver attention.
As one analyst noted, “Tesla is pushing software forward faster than regulators can catch up.”
For Tesla, the investigation isn’t just about compliance; it’s about trust.
The company’s future robotaxi plans depend on convincing both customers and regulators that AI can truly see and react like a human driver.
What happens next
NHTSA’s probe is still in its Preliminary Evaluation phase, the first step before any potential recall.
Such evaluations typically take around eight months to complete, though delays are possible due to ongoing government resource cuts.
Meanwhile, Tesla continues to collect real-world data from FSD users across the U.S. – data Musk says will help “train safer autonomous behavior at scale.”
Self-driving technology has come a long way from science fiction, but this latest investigation proves one thing: innovation can’t outpace accountability.
Whether Tesla’s Full Self-Driving passes this safety test or not, it’s clear that the road to true autonomy still has a few red lights left.