Elon tweeted that v9 of the FSD beta would remove its reliance on radar completely and instead make decisions based purely on vision. Humans don’t have radar, after all, so it seems like a logical step, and it tells us Tesla is feeling much more confident in its vision AI.
Radar and vision each have their advantages, but radar has thus far been much more reliable in detecting objects and determining speed. If you’ve ever noticed your Tesla being able to detect two vehicles in front of you when you can only see the one directly ahead of you, that’s radar at work.
In this situation, the radar’s radio waves bounce underneath the car in front of you, continue traveling, and detect that there is another object ahead, even though the cameras could never “see” it.
It really is one of those wow moments where you can feel the future and how AI-powered cars could one day drive better than humans. It’s baby steps, but slowly we’ll see more and more situations where the vehicle simply sees or does something we never could.
There’s no doubt that more sensors could provide a more reliable and accurate interpretation of the real world, as each has its own advantages. In an ideal world, a vehicle with radar, lidar, vision, ultrasonic sensors, and even audio processing would provide the best solution. However, more sensors and systems come at a price, resulting in increased vehicle cost and system complexity.
After all, humans are relatively safe drivers with two “cameras” and vision alone. If Tesla can completely solve vision, it should be able to achieve superhuman driving capabilities. Teslas have eight cameras facing in all directions, and they’re able to analyze all of them concurrently and make much more accurate interpretations than we ever could in the same amount of time.
Tristan on Twitter recently shared some great insight into Tesla’s vision AI and how it’s going to replace radar. Here’s what Tristan had to say, followed by a few illustrative sketches of the techniques he describes:
"We recently got some insight into how Tesla is going to replace radar in the recent firmware updates + some nifty ML model techniques
From the binaries we can see that they've added velocity and acceleration outputs. These predictions, in addition to the existing xyz outputs, give much of the same information that radar traditionally provides (distance + velocity + acceleration).
For autosteer on city streets, you need to know the velocity and acceleration of cars in all directions but radar is only pointing forward. If it's accurate enough to make a left turn, radar is probably unnecessary for the most part.
How can a neural network figure out velocity and acceleration from static images you ask?
They can't!
They've recently switched to something that appears to be styled on a Recurrent Neural Network.
Net structure is unknown (LSTM?), but they're providing the net with a queue of the 15 most recent hidden states. This seems quite a bit easier to train than normal RNNs, which need to learn to encode historical data and can have issues like vanishing gradients for longer time windows.
The velocity and acceleration predictions are new; by giving the net the last 15 frames (~1s) of data, I'd expect you can train a highly accurate net to predict velocity + acceleration based on the learned time series.
They've already been using these queue-based RNNs with the normal position nets for a few months, presumably to improve stability of the predictions.
This matches with the recent public statements from Tesla about new models training on video instead of static images.
To evaluate the performance compared to radar, I bet Tesla has run some feature importance techniques on the models and radar importance has probably dropped quite a bit with the new nets. See tools like https://captum.ai for more info.
I still think that radar is going to stick around for quite a while for highway usage since the current camera performance in rain and snow isn't great.
NoA often disables in mild rain. City streets might behave better since the relative rain speed is lower.
One other nifty trick they've recently added is a task to rectify the images before feeding them into the neural nets.
This is common in classical CV applications, so I'm surprised it only popped up in the last couple of months.
This makes a lot of sense since it means that the nets don't need to learn the lens distortion. It also likely makes it a lot easier for the nets to correlate objects across multiple cameras since the movement is now much more linear.
For more background on LSTMs (Long Short-Term Memory) see https://towardsdatascience.com/illustrated-guide-to-lstms-and-gru-s-a-step-by-step-explanation-44e9eb85bf21
They're tricky to train because they need to encode history, which is fed into future runs. The more times you pass the state, the more the earlier frames are diluted, hence 'vanishing gradients'."
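To make the queue-of-hidden-states idea concrete, here is a minimal PyTorch sketch of what such a temporal head could look like. It is purely illustrative and based only on Tristan's description, not on Tesla's actual architecture; the module names, the GRU choice, the dimensions, and the 15-frame queue length are all assumptions.

import torch
import torch.nn as nn

QUEUE_LEN = 15  # ~1s of recent frames, per the estimate above

class PerFrameEncoder(nn.Module):
    """Stand-in for a per-camera vision backbone: one frame in, one hidden state out."""
    def __init__(self, hidden_dim=256):
        super().__init__()
        self.backbone = nn.Sequential(
            nn.Conv2d(3, 32, kernel_size=5, stride=4), nn.ReLU(),
            nn.Conv2d(32, 64, kernel_size=3, stride=2), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
            nn.Linear(64, hidden_dim),
        )

    def forward(self, frame):           # frame: (B, 3, H, W)
        return self.backbone(frame)     # hidden state: (B, hidden_dim)

class TemporalHead(nn.Module):
    """Consumes a queue of recent hidden states and predicts per-object
    kinematics: xyz position, velocity, and acceleration."""
    def __init__(self, hidden_dim=256):
        super().__init__()
        # The net is handed an explicit window of history instead of carrying
        # its own recurrent state across long horizons.
        self.temporal = nn.GRU(hidden_dim, hidden_dim, batch_first=True)
        self.head = nn.Linear(hidden_dim, 9)  # xyz + velocity xyz + acceleration xyz

    def forward(self, hidden_queue):    # hidden_queue: (B, QUEUE_LEN, hidden_dim)
        _, last = self.temporal(hidden_queue)
        return self.head(last.squeeze(0))   # (B, 9)

# Usage: keep a rolling buffer of the last 15 per-frame hidden states.
encoder, head = PerFrameEncoder(), TemporalHead()
frames = torch.randn(QUEUE_LEN, 1, 3, 96, 160)             # dummy 15-frame clip
queue = torch.stack([encoder(f) for f in frames], dim=1)   # (1, 15, 256)
kinematics = head(queue)                                    # (1, 9)

Because the model sees a fixed, explicit window of recent states rather than having to compress arbitrarily long history into its own recurrent state, training looks more like supervised learning on short clips, which is the stability and vanishing-gradient benefit described above.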
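Tristan also points to Captum for feature-importance analysis. As a rough illustration of how attribution can be used to gauge how much a model still leans on a radar input, here is a small Integrated Gradients example; the two-input fusion model and its names are hypothetical stand-ins, not anything from Tesla's stack.

import torch
import torch.nn as nn
from captum.attr import IntegratedGradients

class FusionNet(nn.Module):
    """Hypothetical model that fuses vision features with radar features."""
    def __init__(self, vision_dim=256, radar_dim=16):
        super().__init__()
        self.fc = nn.Sequential(
            nn.Linear(vision_dim + radar_dim, 64), nn.ReLU(), nn.Linear(64, 1)
        )

    def forward(self, vision_feat, radar_feat):
        return self.fc(torch.cat([vision_feat, radar_feat], dim=-1)).squeeze(-1)

model = FusionNet().eval()
vision_feat = torch.randn(8, 256)
radar_feat = torch.randn(8, 16)

# Attribute the output to each input. Tracking the magnitude of the radar
# attributions across model versions would show whether its importance is shrinking.
ig = IntegratedGradients(model)
vision_attr, radar_attr = ig.attribute((vision_feat, radar_feat))
print(vision_attr.abs().mean().item(), radar_attr.abs().mean().item())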
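On the rectification point, undistorting each camera frame before inference is a standard classical-CV step. A minimal OpenCV version of the idea is below; the calibration values are placeholders rather than real camera parameters, and this only illustrates the general technique, not Tesla's pipeline.

import cv2
import numpy as np

# Placeholder intrinsics and distortion coefficients; in practice these come
# from a per-camera calibration.
camera_matrix = np.array([[700.0, 0.0, 640.0],
                          [0.0, 700.0, 360.0],
                          [0.0, 0.0, 1.0]])
dist_coeffs = np.array([-0.30, 0.10, 0.0, 0.0, 0.0])  # k1, k2, p1, p2, k3

def rectify(frame):
    """Remove lens distortion so straight lines stay straight and object
    motion across frames and cameras looks more linear to the nets."""
    return cv2.undistort(frame, camera_matrix, dist_coeffs)

frame = np.zeros((720, 1280, 3), dtype=np.uint8)  # dummy image
rectified = rectify(frame)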
Tesla’s FSD beta v9 will be a big step forward from what FSD beta users have been testing, where the system still relied on radar. And it’ll be an even bigger leap from what non-beta testers currently have access to. We can’t wait. Now where’s that button?
Almost ready with FSD Beta V9.0. Step change improvement is massive, especially for weird corner cases & bad weather. Pure vision, no radar.
Tesla has officially launched FSD in Mexico. This is the third expansion of FSD since it was first launched in the United States.
The news was shared by Tesla Owners Mexico on X, followed by confirmation from Tesla AI with a simple but exciting message: "¡Hola México!"
FSD features are geo-fenced, meaning that if a vehicle equipped with FSD crosses into a country where the software isn’t supported, it will automatically revert to Autopilot. This expansion has likely lifted that restriction, meaning U.S. Tesla owners should now be able to drive into Mexico and continue using FSD without interruption.
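As a purely conceptual sketch of that kind of region gating (the country codes, names, and logic here are illustrative assumptions, not Tesla's implementation), the behavior amounts to a simple availability check:

# Conceptual sketch only; not Tesla code. Country list and names are made up.
FSD_SUPPORTED_COUNTRIES = {"US", "CA", "PR", "MX"}

def available_driving_mode(current_country: str, fsd_purchased: bool) -> str:
    if fsd_purchased and current_country in FSD_SUPPORTED_COUNTRIES:
        return "FSD"
    return "Autopilot"  # fall back when crossing into an unsupported region

print(available_driving_mode("MX", fsd_purchased=True))  # "FSD" after this launch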
FSD Global Expansion Timeline
Tesla has set some lofty goals for itself, announcing in its FSD roadmap that it expects FSD to roll out internationally in Q1 and expand to right-hand-drive markets in Q2. While it’s not clear whether Tesla still expects to meet these targets, this is the latest information that was shared in September 2024:
Q1 2025: FSD is expected to launch in Europe and China, pending regulatory approval.
Q2 2025: FSD rollout for right-hand drive (RHD) markets, with a flexible timeline based on approval processes.
Tesla has faced regulatory hurdles, particularly in China, where FSD testing was recently put on hold. One major hurdle is China’s strict data regulations, which require all training data to be collected and stored within the country. This means Tesla cannot rely on its existing U.S.-based data centers and must build local infrastructure to comply with government policies that prevent vehicle data from leaving China.
Additionally, since FSD relies heavily on fleet data, Tesla won’t be able to leverage its vast global dataset. Instead, the company will need to retrain its AI models using data exclusively gathered from vehicles operating within China. These constraints add complexity to Tesla’s FSD rollout, potentially delaying its expansion in the region.
After initially launching FSD Beta in the U.S. to a small group of influencers, Tesla expanded access to more users through its Safety Score program in late 2021. At first, only those with a perfect score of 100 were eligible, but as FSD improved and Tesla grew more confident in its performance, the requirement was gradually lowered.
In March 2022, Tesla took its first step beyond the U.S. by introducing FSD in Canada with the release of v10.11.1. Since then, Tesla has expanded FSD to Puerto Rico, and now, the launch in Mexico marks another major expansion of the software.
This news will likely raise excitement among users outside of North America who have been waiting for FSD for years. With regulatory hurdles in China presenting unique challenges, Tesla may shift its focus to expanding FSD in Europe and Oceania first.
Meanwhile, Tesla is also focused on launching its Robotaxi network, set to debut in Austin, Texas, in June. The upcoming Cybercabs in Austin could be running early builds of FSD v14 or a specialized version designed for Unsupervised FSD.
Like so much else happening at Tesla right now, it’s an exciting time as the company prepares Unsupervised FSD for the Cybercab, continues the expansion of FSD, and works out all the details of its robotaxi network, such as the cleaning hubs and wireless charging capabilities.
In the latest episode of Jay Leno’s Garage, Tesla’s VP of Vehicle Engineering, Lars Moravy, confirmed that the new Model Y will feature adaptive headlights.
While discussing the vehicle’s updated headlights, which now sit a few inches lower than before, Moravy stated that Tesla will add adaptive headlights in the U.S. in a couple of months.
While Tesla has already introduced adaptive headlights in Europe and the Indo-Pacific, the feature has yet to make its way to North America.
Adaptive headlights were originally delayed in the U.S. due to regulatory issues, but manufacturers have been able to implement them since mid-2024. Meanwhile, competitors like Rivian and Mercedes-Benz have already rolled out their own full matrix headlight systems, matching what’s available in other regions.
Update: This article has been updated to clarify that adaptive headlights will indeed be launched in the U.S., shortly after the vehicle launches in March.
Currently, Tesla in North America supports adaptive high beams and automatic headlight adjustment for curves, but full matrix functionality has yet to be rolled out. Meanwhile, matrix headlights are already available in Europe, where they selectively dim individual beam pixels to reduce glare for oncoming traffic and adapt to curves in the road.
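Conceptually, matrix dimming comes down to switching off only the beam segments whose angular slice overlaps a detected oncoming vehicle. The sketch below is a generic illustration of that idea, not Tesla's implementation; the segment count, field of view, and detection format are all assumptions.

# Generic matrix-headlight glare-reduction sketch; all numbers are illustrative.
NUM_SEGMENTS = 24                 # individually controllable beam pixels
FOV_DEG = 48.0                    # horizontal spread of the high beam
SEG_WIDTH = FOV_DEG / NUM_SEGMENTS

def segment_mask(oncoming_angles_deg, margin_deg=1.5):
    """Return True for segments that stay lit and False for segments dimmed
    because an oncoming vehicle sits inside (or near) their angular slice."""
    mask = []
    for i in range(NUM_SEGMENTS):
        center = -FOV_DEG / 2 + (i + 0.5) * SEG_WIDTH
        lit = all(abs(center - angle) > SEG_WIDTH / 2 + margin_deg
                  for angle in oncoming_angles_deg)
        mask.append(lit)
    return mask

# One oncoming car detected roughly 6 degrees left of center:
print(segment_mask([-6.0]))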
It was surprising that matrix functionality wasn’t included in the comprehensive 2024 Tesla Holiday Update. This feature would likely improve safety ratings, so we can only assume Tesla is diligently working to secure regulatory approval.
Adaptive Headlights on Other Models
Lars didn’t confirm whether the refreshed Model Y comes with the same headlights as the new Model 3 and the Cybertruck, instead simply calling them “matrix-style” headlights.
The headlights on the new Model Y appear very similar to those available in the 2024+ Model 3, possibly meaning these other models will also receive adaptive headlight capabilities in the next couple of months.
For vehicles with older-style matrix headlights, it’s unlikely that adaptive beam support will launch at the same time, but it will hopefully become available soon afterward.