Tesla’s Camera & Weather Problem Is Serious

I’m honestly surprised it has taken this long for a serious problem to surface involving the weather and Tesla’s Full Self Driving (FSD) system, which relies entirely on small cameras for sensing. Having had a Tesla Model 3 with FSD for more than six years, I have seen weather-related problems arise countless times.
When the sun shines too brightly at a low angle, it blinds the cameras. When it’s raining hard (as it often does in Florida) and you look at the camera feeds on the touchscreen, they are basically useless. It continues to shock me, and I wonder how FSD is supposed to work in such conditions. Sometimes, after the rain has stopped, I back into a driveway to park, and the water remaining on the car obscures the cameras so badly that I have to find other ways of looking behind me; the cameras are useless. And that’s when it’s not even raining anymore. So I have wondered, time after time: how is FSD’s AI software supposed to get around this problem? How is it supposed to make proper driving decisions when it cannot see through the water?
That’s even at low speeds. What about while driving fast in pouring rain? What about in heavy snow? What about heavy fog? What about dead bugs on the cameras, or other obstructions?
When I saw the news today that NHTSA had found consistent problems with weather degrading Tesla’s camera sensor system, with the system failing to recognize that degradation or to share its severity with drivers quickly enough, and devastating crashes resulting, I immediately thought, “How is this just now coming to light and being investigated?” Here’s how NHTSA put it:
“Available incident data raise concerns that Tesla’s degradation detection system, both as originally deployed and later updated, fails to detect and/or warn the driver appropriately under degraded visibility conditions such as glare and airborne obscurants. In the crashes that [the Office of Defects Investigation] has reviewed, the system did not detect common roadway conditions that impaired camera visibility and/or provide alerts when camera performance had deteriorated until immediately before the crash occurred.
“Review of Tesla’s responses revealed additional crashes that occurred in similar environments and where the system either did not detect a degraded state, and/or it did not present the driver with an alert with adequate time for the driver to react. In each of these crashes, FSD also lost track of or never detected a lead vehicle in its path.”
I knew from plenty of personal experience that Tesla FSD couldn’t handle various weather conditions and would refuse to activate, or would warn me, in those conditions. What I didn’t know was that the system was doing a poor job of recognizing this and waiting too long to warn drivers or deactivate. I didn’t know that, in many cases, it was only doing so at the last moment, when there was no longer time for the human driver to handle the situation. However, I am also not surprised. The system, like its head creator, seems to be overconfident. Built into it seems to be the assumption that it can and should be doing more than it is actually equipped to do.
Tesla’s Robotaxi program was supposed to be covering half the US population by the end of 2025. It covered 0% of the population. It’s almost April 2026, and the program shows no signs of being able to grow beyond a simple trial phase with human supervisors. I wonder how much of that is due to the system not being good enough in various weather conditions, and how much is due to other critical problems with the system. This is only the 10,000th time a Tesla FSD/robotaxi target has been missed (hyperbole), but it seems Elon Musk is still intent on pushing a camera-only system forward as if the cameras weren’t disastrously compromised by heavy rain or bright, low sunshine. How can the software be good enough to piece things together and drive appropriately when the cameras are obscured? I suspect it is that very effort to make them do so that has led to the system failing to warn drivers, or warning them too late, that FSD isn’t capable of driving safely in those scenarios.