The NHTSA Investigation into Tesla FSD and What It Means for Owners

Hey Tesla lovers, buckle up. There's breaking news that affects nearly 2.9 million Tesla vehicles with Full Self-Driving (FSD). The U.S. safety regulator, NHTSA (National Highway Traffic Safety Administration), has opened a preliminary investigation into traffic-law violations tied to Tesla's FSD system. We're talking about running red lights and making unsafe lane changes, sometimes into oncoming traffic: things nobody wants happening, especially when the car is supposed to help.

As someone who's excited about what Tesla is doing, I believe this matters a lot, not just for safety but for the credibility of FSD, investor confidence, your car's resale value, insurance, and how Tesla evolves from here. In this article, I'll give you a deep dive: what's happening, what the evidence shows, the technical and regulatory implications, what you as an owner should be aware of, and what might come next.

Here’s the roadmap:

  1. Background: What is FSD, and how did we get here

  2. Details of the NHTSA probe: scope, claims, crashes, and behavior

  3. Potential root causes: what may be going wrong under the hood

  4. Risks & consequences for Tesla owners and the community

  5. The broader regulatory & industry impact, U.S. vs Europe

  6. What you should watch, what you might do now

  7. Conclusion & forecast

  8. FAQ

1. Background on Tesla’s FSD / Autopilot architecture

1.1 Evolution from Autopilot to FSD

Tesla's journey from Autopilot to Full Self-Driving has always been ambitious. Autopilot started as a set of advanced driver-assist features: lane keeping, adaptive cruise control, and so on. Over time Tesla added more: Navigate on Autopilot (freeway lane changes and exit suggestions), Traffic Light and Stop Sign Control, and others. FSD is the premium offering, advertised as eventually able to handle city streets, make turns, and stop at lights, all with minimal input (though Tesla has always stated the driver must remain alert).

Over the years, Tesla has pushed the envelope with frequent over-the-air (OTA) software updates, using data from its vast fleet to improve recognition, cornering behavior, and more. Many fans, myself included, have followed every version update, noting incremental improvements, strange edge-case failures, and occasional blunders.

1.2 Tesla’s claims vs reality — driver-assist, not full autonomy

It's important to be clear: despite the name, FSD is not fully autonomous. Tesla's official disclaimers repeatedly emphasize that the driver must remain attentive, keep hands on the wheel, and intervene when needed.

In practice, though, many users treat FSD as more capable than it is, and some get lulled into overtrusting it. Because AI perception systems are probabilistic, failure modes show up in rare or complex scenarios: ambiguous lights, glare, unusual signals, intersections, bad weather. Those failure modes are precisely what regulators are watching.

1.3 Known prior scrutiny and investigations

This is not Tesla’s first time under the microscope. Some prior investigations include:

  • Crashes under reduced visibility (fog, glare, dust) where FSD or Autopilot was engaged. 

  • Allegations that Tesla delayed crash reporting to NHTSA. 

  • Previous safety complaints about recognition of stop signs, traffic lights, and lane markings.

  • Earlier NHTSA audits about how well Tesla complies with its duty to report incidents involving advanced driver assistance systems.

These build the context: this new probe doesn’t come from nowhere.

2. The new NHTSA probe: scope and allegations

Now let's get specific about what the probe announced on October 9, 2025, is all about. This is serious, so take note.

2.1 Who, what, when — number of vehicles, triggers

  • NHTSA is investigating approximately 2.88 million Tesla vehicles equipped with FSD. 

  • It’s a preliminary evaluation (first step) that could lead to a recall if the system is found to create an unreasonable risk. 

  • The trigger: more than 50 reports of traffic-law violations, including 14 crashes that resulted in 23 injuries.

2.2 Key Alleged Violations

The most concerning allegations include:

  • Vehicles on FSD running red lights, i.e., entering intersections while the signal is red.

  • Vehicles making unsafe lane changes, sometimes into oncoming traffic or against the lawful direction of travel.

  • FSD failing to stop properly or remain stopped at red lights, e.g., not coming to a complete stop or misrecognizing the signal state.

  • Cases where drivers were not alerted in time as the car approached a red light, or where FSD did not provide sufficient warnings.

2.3 Crashes, injuries, and examples

  • The investigation includes 14 crashes in which 23 people were injured.

  • At least six reports where the Tesla, with FSD engaged, “approached an intersection with a red traffic signal, continued into the intersection against the red light, and was subsequently involved in a crash with other motor vehicles.”

  • Four of those crashes resulted in injuries.

  • There are multiple consumer complaints and media reports alleging that FSD failed to detect the correct traffic-signal state, display it properly, or warn the driver.

2.4 Regulatory procedural status: what can happen next

  • Right now it’s a preliminary evaluation by NHTSA’s Office of Defects Investigation. If evidence shows an unreasonable risk, the agency can require Tesla to recall affected software or hardware, or impose other remedial actions.

  • Tesla will presumably respond in writing with data, logs, and software versions, and may propose fixes via over-the-air updates.

  • This investigation adds to existing scrutiny, including earlier probes about visibility conditions and crash reporting delays. 

3. Technical root causes and failure modes

As a Tesla enthusiast, I always geek out over the technical bits. Let’s explore what might be causing FSD to misbehave in these particular ways.

3.1 Sensor / perception limitations

  • Tesla's perception stack is heavily camera-based, with some radar depending on model. Cameras are capable, but traffic signals are hard to interpret reliably when they are partially obscured, backlit, affected by glare, nonstandard in shape or style, or when signal heads are tilted or occluded.

  • Lighting conditions: dusk and dawn, sun glare, shadows, and wet-pavement reflections can confuse red-versus-green detection.

  • Infrastructure variability: traffic signals in the U.S. vary widely in placement and visibility, and redundant signal heads (on poles, hanging wires, mast arms) can complicate the view. The toy sketch after this list shows why frame-by-frame classification alone is fragile in these conditions.
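
Tesla's actual perception stack is proprietary, so what follows is only a toy sketch of the general problem, with every name, number, and threshold invented for illustration: a per-frame classifier emits a (label, confidence) pair, glare corrupts some frames, and a temporal vote tries to recover the true state.

```python
from collections import Counter, deque

def smoothed_state(frame_predictions, window=10, min_confidence=0.6):
    """Majority-vote the most recent confident frames; return None if unsure."""
    recent = deque(maxlen=window)
    for label, confidence in frame_predictions:
        if confidence >= min_confidence:   # drop glare/occlusion frames
            recent.append(label)
    if len(recent) < window // 2:          # too few usable frames: no decision
        return None
    label, votes = Counter(recent).most_common(1)[0]
    return label if votes / len(recent) > 0.7 else None

# Ten frames approaching a backlit signal; glare flips some frames to "green".
frames = [("red", 0.90), ("red", 0.85), ("green", 0.55), ("red", 0.90),
          ("green", 0.62), ("red", 0.88), ("green", 0.58), ("red", 0.91),
          ("red", 0.87), ("red", 0.90)]
print(smoothed_state(frames))  # -> "red": smoothing recovers the true state
```

The uncomfortable part is the None branch: a real system must still decide what to do when the vote is inconclusive, and that is exactly where the alleged red-light failures live.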

3.2 Decision / control logic flaws

  • Misclassification of signal state: the software either misinterprets red as yellow or green, or fails to detect red at all in some frames.

  • Timing of recognition vs decision: even if detection is eventually correct, processing latency may cause the car to proceed because it believed the light was turning green, or to misjudge the distance needed to stop. The back-of-the-envelope calculation after this list shows how quickly latency eats into stopping distance.

  • Lane-change algorithms: choosing between conservative and assertive lane changes, dealing with oncoming traffic, merging, and obeying direction-of-travel rules are all tricky.
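
To make the latency point concrete, here is a rough calculation. The deceleration and latency values are illustrative assumptions, not Tesla specifications; the point is simply that every extra tenth of a second of processing is road covered before braking even begins.

```python
def stopping_distance(speed_ms, latency_s, decel_ms2=7.0):
    """Distance rolled during detection latency plus braking distance to a stop."""
    latency_gap = speed_ms * latency_s           # travel before braking starts
    braking = speed_ms ** 2 / (2 * decel_ms2)    # v^2 / (2a)
    return latency_gap + braking

v = 50 / 3.6  # 50 km/h expressed in m/s (about 13.9 m/s)
for latency in (0.1, 0.3, 0.5):
    print(f"{latency:.1f} s latency -> {stopping_distance(v, latency):.1f} m")
# 0.1 s -> 15.2 m, 0.3 s -> 17.9 m, 0.5 s -> 20.7 m
```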

3.3 Edge case scenarios

  • Approaching an intersection just as the light turns red: may the car cross, or must it stop? These boundary cases are genuinely ambiguous, even for human drivers; see the dilemma-zone sketch after this list.

  • Red signals that are partially blocked by trees, signage, or in tunnels or under bridges.

  • Inconsistent or outdated traffic infrastructure: signal heads broken, misaligned, or electrical issues.

  • Unexpected behavior from other drivers, or a sudden obstruction, may force FSD to react unsafely.
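
Traffic engineers call the first bullet's boundary case the "dilemma zone": a band of distances from the stop line where the car can neither stop in time nor clear the intersection before the red. A minimal sketch, using assumed values for reaction time, deceleration, yellow duration, and intersection geometry:

```python
def dilemma_zone(speed_ms, yellow_s=3.0, reaction_s=0.5,
                 decel_ms2=3.5, intersection_m=20.0, car_len_m=4.8):
    """Return (near, far) bounds of the dilemma zone, or None if there is none."""
    min_stop = speed_ms * reaction_s + speed_ms ** 2 / (2 * decel_ms2)
    max_clear = speed_ms * yellow_s - (intersection_m + car_len_m)
    return (max_clear, min_stop) if min_stop > max_clear else None

zone = dilemma_zone(60 / 3.6)  # 60 km/h
if zone:
    print(f"Dilemma zone: {zone[0]:.1f} m to {zone[1]:.1f} m from the stop line")
# -> Dilemma zone: 25.2 m to 48.0 m from the stop line
```

Inside that band the system has to commit to one of two imperfect options, and it has to do so consistently across every signal style and lighting condition.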

3.4 Tesla’s mitigation strategies

  • Over-the-air updates: Tesla often pushes software versions that aim to improve traffic signal recognition, intersection behavior, braking/timing.

  • Enhanced driver alerts: requiring hands on wheel, alert reminders, steering torque feedback, etc.

  • Data collection: Tesla’s fleet generates a lot of visual / sensory data, which can be used to retrain the neural networks to better distinguish signal states.

  • Internal logs: incident reporting by owners helps, and Tesla is likely monitoring flagged videos via its "send video / telemetry" tools. One plausible shape for an OTA mitigation is a confidence gate of the kind sketched below.
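
To illustrate what "restricting behavior when detection confidence is low" could look like, here is a conceptual sketch. It is my speculation about a general safety pattern, not Tesla's actual control logic; every type, name, and threshold is invented for illustration.

```python
from dataclasses import dataclass

@dataclass
class SignalObservation:
    state: str         # "red", "yellow", "green", or "unknown"
    confidence: float  # 0.0 to 1.0, as reported by the perception stack

def intersection_action(obs: SignalObservation, threshold: float = 0.9) -> str:
    """Degrade gracefully instead of committing on a shaky detection."""
    if obs.state == "unknown" or obs.confidence < threshold:
        return "ALERT_DRIVER_AND_SLOW"   # hand control back early
    if obs.state in ("red", "yellow"):
        return "BRAKE_TO_STOP"
    return "PROCEED"

print(intersection_action(SignalObservation("green", 0.55)))
# -> ALERT_DRIVER_AND_SLOW: a shaky "green" is treated as no detection at all
```

The design choice worth noticing is that the gate fails toward the driver: low confidence produces an early slowdown and alert rather than a committed maneuver.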

4. Risk & liability implications for Tesla owners

If you own an FSD-enabled Tesla, or are thinking about getting one, you need to know what this means for you in real life.

4.1 Legal liability

  • If your Tesla, with FSD enabled, enters an intersection on red or makes an unsafe lane change and causes a crash, you (the driver) may be legally liable, because Tesla explicitly requires driver supervision. Courts in many states will likely still assign primary responsibility to the driver unless Tesla is proven negligent.

  • But if this probe discovers systematic flaws, owners might be part of class actions, or Tesla might be compelled to issue recalls or software patches.

4.2 Insurance implications

  • Insurance companies may raise premiums for FSD-equipped Teslas, or create special risk categories, especially in states where few regulatory constraints exist.

  • If crash data shows repeated red-light violations tied to FSD, insurers might classify such vehicles as higher risk. That could cost you at renewal time and hurt resale value.

4.3 Behavioral precautions

  • Until the probe resolves, always remain vigilant: hands on wheel, eyes on road, be ready to override. Don’t treat FSD like autopilot in sci-fi movies.

  • Test its behavior on familiar routes; don't rely on it blindly when approaching intersections or on complex urban roads where signals are less standard.

  • Use video logging (Dashcam, Tesla’s in-car logs) to capture anomalies; report problems through Tesla’s channels.

4.4 Recall / software disablement potential

  • If NHTSA decides there is an “unreasonable risk,” it may order a recall or require Tesla to disable certain behaviors or limit FSD usage in some contexts until fixes are in place.

  • Tesla might push software patches to address the violations; in the worst case, certain functions may be restricted or limited (e.g., disallowing lane changes or intersection crossing when signal-detection confidence is low).

5. Broader regulatory and industry impact

What’s happening here is part of a larger trend. For those of us watching Tesla as a pioneer in autonomous driving and ADAS, this probe could set important precedents.

5.1 Implications for autonomous vehicle legislation / standards

  • The definitions of what driver-assist systems must do, and where permissible assistance ends and unsafe automated behavior begins, are under pressure. Laws may need updating.

  • Regulation around when the driver is required to intervene versus what the manufacturer is responsible for will probably get tighter.

5.2 U.S. vs European regulators: comparisons

  • Europe (the EU, the UK, etc.) tends to have stricter safety and vehicle type-approval standards, and sometimes more aggressive public-safety mandates. If U.S. regulators force recalls or restrictions, European regulators will likely respond in kind, or may preemptively audit or ban certain FSD behaviors.

  • European drivers may already have stricter legal exposure for traffic law violations; insurance and liability norms differ.

5.3 Impact on Tesla’s training of full autonomy / robotaxi plans

  • Tesla has long touted robotaxi ambitions and scaling up full autonomy. But safety failures of this kind undermine credibility.

  • Investor pressure, public perception, and legal risk may slow the rollout, possibly forcing more cautious development or limiting which features are released to the public.

5.4 Reputational impact & investor confidence

  • News like this tends to dent confidence — both from consumers (some will be wary) and investors (risk of liability, recalls, regulatory fines).

  • Tesla’s stock already dropped slightly after the announcement. 

  • How Tesla responds — swiftly, transparently, with visible improvements — may determine whether this becomes a small bump or something more serious.

6. What owners in the U.S. & Europe should watch / do

If you own a Tesla with FSD (or are thinking of getting one), here are practical steps and things to keep an eye on.

6.1 Tracking official updates

  • Watch for any recall notices from Tesla or NHTSA. Tesla may publish software release notes that specifically reference intersection / signal handling improvements.

  • Monitor communications from NHTSA (its website and public filings) and Tesla's regulatory filings. If you like to automate this, a small script like the one sketched below can poll NHTSA's public recall data.
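
A minimal sketch of such a check in Python follows. The endpoint and field names reflect NHTSA's public recalls API as I understand it; treat the URL and response fields as assumptions and verify them against NHTSA's own API documentation before relying on the script.

```python
import requests

def check_recalls(make: str, model: str, model_year: int) -> None:
    """Print recall campaigns for one vehicle via NHTSA's public recalls API."""
    url = "https://api.nhtsa.gov/recalls/recallsByVehicle"  # assumed endpoint
    params = {"make": make, "model": model, "modelYear": model_year}
    resp = requests.get(url, params=params, timeout=10)
    resp.raise_for_status()
    for item in resp.json().get("results", []):
        print(item.get("NHTSACampaignNumber"), "-", item.get("Component"))

check_recalls("TESLA", "MODEL 3", 2023)
```

Keep in mind that an OTA software fix does not always arrive as a formal recall, so treat this as one signal among several, alongside Tesla's release notes.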

6.2 Best practices with FSD usage

  • In the meantime, always be ready to intervene when approaching intersections and red lights, or whenever signals are unclear. Reduce speed if the view is obscured.

  • Where the signal infrastructure is unfamiliar, take over control earlier. Drive manually in tricky areas. Don't over-trust the system.

6.3 Reporting issues

  • Whenever you see FSD misbehaving, record video if it's safe to do so, and report it via Tesla's in-car feedback or customer service, and possibly via NHTSA's complaint system. These reports help.

  • Check online communities (forums, owners' groups); early signs of a pattern often show up there first.

6.4 Impacts on resale, insurance, regulatory compliance

  • Be aware that FSD's resale value could decline if the perceived risk increases, or if insurers begin charging more for FSD-equipped cars.

  • If you plan to sell, document your car’s software version history, update logs, and whether you followed best practices. That may help prospective buyers feel more confident.

Conclusion

So what do I make of all this? As a Tesla fan, I remain optimistic, because Tesla has done incredible things with OTA updates, fleet learning, and pushing ADAS forward. But this probe is a wake-up call: a huge installed base of FSD vehicles means even rare errors become significant in aggregate. Running red lights and making unsafe lane changes are not small glitches; they are safety-critical failures that must be addressed rigorously.

I expect Tesla will respond with software improvements, and possibly temporary restrictions on certain behaviors (especially intersection crossings or complex lane changes) until signal recognition is more robust. Depending on what NHTSA concludes, there may be recalls, legal consequences, or at least mandatory disclosures.

For us owners: stay alert, stay informed, don’t treat FSD as magic. Use it, but respect its limits. That balance is essential if FSD is going to live up to its promise — safer roads, less driver fatigue, new mobility.

FAQ

Q: Does this mean FSD will be disabled on my car?
A: Unlikely in full, unless a specific mode or behavior is found to be unsafe. More probable are software patches, alerts, or behavior restrictions. Tesla usually pushes updates rather than disabling entire features.

Q: Are European Tesla owners affected by this U.S. probe?
A: Not legally by U.S. regulations, but yes in influence. If systemic issues are identified, Europe’s regulators may open similar investigations. Plus, Tesla’s software is often similar globally, so fixes or changes may propagate.

Q: What types of accidents have been reported, and what is Tesla’s responsibility?
A: The reported cases include crashes at intersections where Teslas allegedly ran red lights with FSD engaged; some caused injuries. Tesla’s responsibility will depend on whether system faults can be shown (signal misrecognition, behavior logic, warnings), and whether driver supervision was present.

Q: Should I stop using FSD until this is resolved?
A: That depends on your risk tolerance. If you feel uncomfortable, or drive often through complex intersections with variable signals, you might dial it back. Otherwise, stay attentive and intervene when needed.

Q: How can I know if my Tesla will be subject to a recall or software update?
A: Keep an eye on official notices from Tesla and NHTSA, check your Tesla’s software release notes, monitor firmware updates, and Tesla’s support portal. Sometimes owners will see “release candidate” versions that mention signal-detection improvements or intersection behavior fixes.
