Federal regulators have opened a sweeping new investigation into Tesla’s “Full Self-Driving” system, the very technology Elon Musk once described as the future of mobility. But after dozens of reports of Teslas running red lights, veering into wrong lanes, and even crashing without warning, that futuristic promise is suddenly facing its toughest test yet.
The U.S. National Highway Traffic Safety Administration (NHTSA) confirmed it is examining 58 incidents in which Tesla vehicles allegedly broke traffic laws while using the company’s Full Self-Driving (FSD) mode. These incidents led to multiple crashes, fires, and injuries. The probe covers nearly 3 million cars, virtually every Tesla equipped with FSD in the United States.
Tesla insists that drivers must keep their hands ready on the wheel and eyes on the road, even while using the system. But investigators say many owners involved in accidents claimed the cars gave no warning before behaving unpredictably. That finding has renewed long-standing criticism that Tesla’s “Full Self-Driving” branding is misleading and encourages over-reliance on automation.
A Turning Point for Musk’s Autonomous Vision
The investigation lands at a delicate moment for Elon Musk, who continues to promote FSD as the foundation of his planned Robotaxi network, a fleet of fully driverless Teslas expected to operate in U.S. cities next year. Musk’s repeated deadlines and ambitious forecasts have long divided analysts; now regulators appear to be demanding proof that the software is truly ready for real-world streets.
“The ultimate question is whether the software actually works,” said one market analyst following the probe. Another investor put it more bluntly: “The world has become Elon’s open-road experiment, and it’s starting to show cracks.”
Tesla’s stock briefly dipped after news of the probe, reflecting investor concern that the company’s self-driving bet could face delays or tougher oversight.
This isn’t the first time NHTSA has scrutinized Tesla’s driver-assist tech. Earlier reviews covered collisions in fog and other low-visibility conditions, as well as misuse of the company’s “Smart Summon” feature. But this new probe is broader and more aggressive, signaling that U.S. regulators may no longer accept incremental software tweaks as sufficient proof of safety.
The outcome could shape the entire autonomous-vehicle industry. If Tesla is forced to redesign or rebrand FSD, other companies developing AI-driven transport, from Waymo to Cruise, may face stricter standards as well. For Tesla, the risk is existential: its car-sales margins are shrinking, and success in autonomy is key to keeping Musk’s “driverless future” narrative alive.
After years of bold promises and high-profile demos, Tesla’s self-driving journey is hitting an unavoidable question: not when cars will drive themselves, but how safely they can do it.
The technology might be advanced, but public trust will decide whether it ever truly takes the wheel.