Tesla recently launched FSD in China, which has led many people to wonder exactly how they did it so quickly. Tesla isn't allowed to send training data out of China, meaning that it can’t leverage the capacity of the new Cortex Supercomputer Cluster at Giga Texas.
Instead, Tesla is using their generalized model, in combination with synthetic training data, to train FSD for China. Of course, Tesla also uses this same synthetic data to supplement training for North America and Europe. With European FSD on the horizon, we’ll likely see more and more use of synthetic training data as a sure-fire means of handling edge cases.
Simulated Content
Tesla officially refers to the synthetic training data as “Simulated Content” throughout their patent, which is titled “Vision-Based System Training with Synthetic Content.” Let’s break it down into easier-to-understand chunks.
Vision-Only Training
As you may well know, Tesla’s approach to autonomy focuses on using Tesla Vision. That means cameras providing visual data are the primary - and really only - means of acquiring data from outside of the vehicle. They no longer use radar and only use LiDAR to ensure vision sensor accuracy during training.
Capturing all the information from around the car builds a 3D environment that the vehicle uses to plan its path and make decisions. That data is processed into a fairly comprehensive view of what is currently around the vehicle and what is predicted to be around it in the near future. All of it is also tagged and characterized to help the system prioritize its decisions.
Supervised Learning Model
Tesla’s FSD training is done through a supervised learning model. That means the training model is fed data that has already been labeled, either by humans or by Tesla’s own AI labeling models. Objects in the images are identified and tagged with position, velocity, and acceleration. This information acts as a ground truth for the AI model to learn from, allowing it to recognize and interpret similar objects and situations when it encounters them in real-world driving.
Ground Truth Label Data
The ground truth label data is a critical part of this supervised learning process. The labeled data gives the model accurate information about objects and their characteristics in the images, enabling Tesla to develop FSD’s robust understanding of the environment around it while driving. This data is typically collected from real-world driving scenarios and is annotated either manually or automatically.
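To make that concrete, here’s a rough sketch in Python of what one ground-truth labeled sample could look like. The field names and schema are our own placeholders - the patent describes the kinds of labels (object identity, position, velocity, acceleration) but not an exact format.

```python
from dataclasses import dataclass, field

@dataclass
class LabeledObject:
    """A single annotated object in a camera frame (hypothetical schema)."""
    category: str                              # e.g. "vehicle", "pedestrian", "lane_line"
    position: tuple[float, float, float]       # x, y, z in meters, relative to the ego vehicle
    velocity: tuple[float, float, float]       # meters per second along each axis
    acceleration: tuple[float, float, float]   # meters per second squared along each axis

@dataclass
class GroundTruthSample:
    """One labeled frame: the source image plus every object annotated in it."""
    image_path: str
    objects: list[LabeledObject] = field(default_factory=list)

# A hand-built example: a single vehicle 20 m ahead, slowing down.
sample = GroundTruthSample(
    image_path="frames/000123.jpg",
    objects=[LabeledObject("vehicle", (20.0, 0.0, 0.0), (8.0, 0.0, 0.0), (-1.5, 0.0, 0.0))],
)
```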
Generating Simulated Content
To supplement the real-world ground truth label data, Tesla employs a simulated content system - which is really the key portion of this patent. That system generates synthetic training data that closely resembles the labeled ground truth data described above.
Content Model Attributes and Contextual Labeling
The generation of that simulated content is guided by what Tesla calls “content model attributes,” which are essentially the key characteristics or features that are extracted from the ground truth label data. These could include things like road edges, lane lines, stationary objects, or even dynamic objects like vehicles or pedestrians.
By varying these attributes, the system can create a wide array of simulated scenarios - which means FSD’s training program is exposed to as many situations as possible, both common and unusual.
In addition to the attributes, the system also incorporates contextual labeling - which involves adding labels to the simulated content to help refine it with even more detail. These labels can include things like weather conditions, time of day, or even the type of road or environment the vehicle is driving in. All this information is useful context to help develop FSD’s understanding of driving environments.
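For illustration, a single simulated scene might be described by something like the structure below. Every key and value here is a placeholder of our own - the patent names the concepts (content model attributes and contextual labels) but doesn’t define a concrete schema.

```python
# A hypothetical description of one simulated scene, combining content model
# attributes (geometry and objects) with contextual labels (weather, time, road type).
simulated_scene = {
    "attributes": {
        "lane_lines": 3,                          # number of lane markings in view
        "road_edge_curvature": 0.02,              # 1/m - a gentle bend
        "objects": [
            {"category": "vehicle", "position": (25.0, -3.5, 0.0)},
            {"category": "pedestrian", "position": (40.0, 6.0, 0.0)},
        ],
    },
    "context": {
        "weather": "rain",
        "time_of_day": "night",
        "road_type": "urban",
    },
}
```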
Training Data Generation
Tesla’s simulated content system generates vast amounts of training data by creating variations of the content models. These variations generally involve tweaking the attributes of the objects in the scene, changing environmental conditions, or introducing new types of driving scenarios, like heavy traffic or construction.
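Conceptually, that variation step could look something like the sketch below, which perturbs object positions and redraws the contextual labels of a base scene like the one above. The jitter ranges and label pools are purely illustrative - the patent doesn’t specify how variations are sampled.

```python
import copy
import random

# Illustrative label pools - not from the patent.
WEATHER = ["clear", "rain", "fog", "snow"]
TIME_OF_DAY = ["day", "dusk", "night"]

def generate_variations(base_scene: dict, n: int, seed: int = 0) -> list[dict]:
    """Create n simulated scenes by perturbing a base content model."""
    rng = random.Random(seed)
    variations = []
    for _ in range(n):
        scene = copy.deepcopy(base_scene)
        # Jitter each dynamic object's position by a small, random amount.
        for obj in scene["attributes"]["objects"]:
            x, y, z = obj["position"]
            obj["position"] = (x + rng.uniform(-2.0, 2.0), y + rng.uniform(-1.0, 1.0), z)
        # Redraw the contextual labels for this variation.
        scene["context"]["weather"] = rng.choice(WEATHER)
        scene["context"]["time_of_day"] = rng.choice(TIME_OF_DAY)
        variations.append(scene)
    return variations
```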
Training FSD
Wrapping it all up - the combined dataset of both real-world data and simulated data is then used to train FSD. By continuously providing new sets of both types of input, Tesla can continue to refine and improve FSD further.
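In spirit, combining the two sources might look like the simple sketch below. The mixing ratio is our own illustrative knob - Tesla doesn’t say how real and simulated data are balanced.

```python
import random

def build_training_set(real_samples: list, simulated_samples: list,
                       sim_fraction: float = 0.3, seed: int = 0) -> list:
    """Mix real-world and simulated samples into one shuffled training set."""
    rng = random.Random(seed)
    # How many simulated samples are needed so they make up sim_fraction of the total.
    target_sim = int(len(real_samples) * sim_fraction / (1.0 - sim_fraction))
    picked_sim = rng.sample(simulated_samples, min(target_sim, len(simulated_samples)))
    combined = list(real_samples) + picked_sim
    rng.shuffle(combined)
    return combined
```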
Why Use Simulated Content?
It might seem counterintuitive that Tesla utilizes simulated content for training their autonomous driving system when their vehicles already collect vast amounts of real-world driving data. Their vehicles drive hundreds of millions of miles a month, all across the globe - providing them access to an unfathomable amount of unique data. Well, there are a few reasons to do so.
Cost Reduction
One of the primary advantages of using simulated content is cost reduction. By not having to collect, transmit, sort, label, and process the incoming data from the real world, Tesla can instead just create data locally.
That cuts costs for data transmission, data storage, and all the processing and labeling - whether by human or machine. That can be a fairly significant amount when you think about just how much data goes through Tesla’s servers every single day from vehicles all around the world.
Simulating Challenging Conditions
Simulated content allows Tesla to train FSD in a wide range of environmental conditions that might be rare, difficult, or even dangerous to encounter consistently in real-world driving. This can include challenging conditions like heavy rain, fog, or snow - or even nighttime driving in those conditions.
By training the system on this type of content without trying to pull it from real vehicles, Tesla can ensure that FSD remains operable and fairly robust even in more difficult scenarios in the real world.
Edge Cases & Safety
Another crucial benefit of simulated content is the ability to train FSD on edge cases. While we sometimes jokingly refer to edge cases as things like stopping for a school bus, there are real edge cases that may not be frequently encountered in real-world driving scenarios but can pose real safety risks for drivers, occupants, pedestrians, or other road users. Think of things that you could see happening but have never actually seen, like a car falling off a transport trailer or a highway sign falling down.
As such, Tesla simulates many unique edge cases, including sudden pedestrian crossings, unexpected obstacles in the road, or even erratic behavior from other drivers. All of these situations are hard to capture regularly in the real world, which means simulating and training on them is essential to ensure safety.
Efficient and Continuous Optimization
Finally, the vast amount of diverse training data that can be generated by Tesla on demand means that they can quickly and efficiently iterate on FSD without needing to wait for real-world data. This means they can keep a continuous learning process going, ensuring that FSD is always improving bit by bit.
If you’re interested in reading more about the guts that make FSD tick, check out our entire series on FSD-related patents from Tesla here.
We’d also recommend our deep dive into Nvidia’s Cosmos - a training system for autonomous vehicles that primarily uses synthetic data to train machine models. It’s a different take from Tesla’s FSD training cycle, which primarily relies on real data, but it shares some similarities with this use of simulated content.
Tesla has just announced the contents and features of its 2025 Spring Update. There’s a lot of new content that we expected, as well as some stuff we didn’t see coming that will be arriving in Tesla’s next major release. Awesome new features, such as Adaptive Matrix High Beams, will finally become available in North America, while others like Grok’s voice assistant aren’t quite ready yet.
So, without further ado, let’s get cracking and take a look at everything in this awesome update.
Adaptive High Beams
The headliner feature of this update is the much-awaited Adaptive High Beams for North America - specifically the United States and Canada. We’ve been waiting a little over a year since the feature launched in Europe. Tesla faced some regulatory delays in getting it approved, but it’s finally arriving for vehicles with newer headlights.
Adaptive High Beams reduce glare for traffic ahead of you by individually dimming specific pixels on the LED matrix. The feature shipped with the refreshed Model Y first and is now arriving for all other vehicles with matrix headlights. This includes newer Model S, Model 3, Model X, and Model Y vehicles - but not the Cybertruck.
The adaptive headlights in action.
Not a Tesla App
The Cybertruck’s signature headlights are too small to fit the LED matrix, and as such, this feature won’t be supported on the Cybertruck for the time being. Hopefully, Tesla will figure something out, but given that this is a hardware limitation, we don’t expect to see much here.
You can check out our guide on how to determine whether your vehicle is equipped with Matrix Headlights. If your vehicle has the hardware, you will see an Adaptive Headlights option under Controls > Lights > Adaptive Headlights after receiving the Spring Update. This feature will be enabled by default.
Improved Blind Spot Camera for Model S / X
The new blindspot camera in the driver's instrument cluster.
Not a Tesla App
In a surprise addition, Tesla is improving support for the Blind Spot Camera on the 2021+ refreshes of their flagship vehicles. Previously, the blind spot camera on these vehicles would only appear on the primary infotainment screen, not the driver’s instrument panel - an implementation essentially copied over from the Model 3/Y.
Now, drivers will have the option to choose which display the blind spot camera appears on. A setting under Controls > Display > Automatic Blind Spot Camera will allow drivers to choose “Driver Screen”, so that the blind spot camera appears to the left or right side of the instrument cluster, depending on which turn signal you activate. For these vehicles with an instrument cluster directly in front of the driver, this is a much better implementation of the feature than how it was originally designed.
Dashcam Update - B Pillar Cameras
As part of a much-requested update, given the increased and misguided vandalism against Tesla vehicles, Tesla’s team has finally updated their software to record the B-pillar (upper side) cameras as part of both Dashcam and Sentry Mode.
While this means that Dashcam and Sentry Mode footage will now likely take up more room on your USB drive due to recording two additional cameras, it also means that your vehicle is much better protected. Dashcam and Sentry Mode now record from every camera except for the additional front-facing cameras and the interior camera.
Note: It looks like this feature will be limited to newer vehicles, likely those with AI4.
Improved Dashcam Viewer
The updated dashcam viewer.
Not a Tesla App
The Dashcam Viewer in the vehicle is also being improved with this update. Taking a page from the Tesla app, the in-vehicle viewer will now display multiple camera feeds at the same time, with the option to focus on an individual feed if desired.
Due to the additional cameras being recorded, Tesla is now laying out all the camera feeds along the bottom, instead of at each corner of the screen.
The new UI also reveals that there will be buttons to jump back or forward in 15-second increments, while at the top right, you’ll have a link to the next video, instead of having to go back to the list of videos.
Requirements for Dashcam and Sentry Mode Updates
Unfortunately, there is some bad news regarding compatibility with the B-pillar camera recording and this improved Dashcam Viewer. Tesla says the Dashcam updates will only apply to newer “S3XY” vehicles, but they don’t specify the exact requirement.
Based on previous Tesla posts, where they usually list if a feature requires the AMD Ryzen infotainment processor, this requirement doesn’t sound like an Intel vs AMD issue, but instead one that relies on AI4 hardware, which is responsible for processing the video feeds.
Tesla’s “S3XY” requirement also leaves out the Cybertruck, but this seems like an oversight. Given some previously leaked footage of this feature, we expect the Cybertruck to also receive this feature with the Spring Update.
Updated Routing Options — Avoid Highways
We recently covered routing options on the site, and we believe a lot of people will be pleased with these additions, so if you’ve been craving improved routing options, keep reading.
Users will now be able to pick from three routing options when choosing a destination. We originally saw these as part of the navigation source code discovered in December 2024.
Fastest: This offers the quickest path to the destination, ignoring efficiency and stopping more often for shorter charges.
Best Amenities & Fewer Stops: This routing mode minimizes your charge stops in exchange for making them longer, but also allows you to stop near highly rated restaurants, shops, and restrooms for a more relaxing trip.
Avoid Highways: This much-requested feature will enable you to keep your navigation routing away from highways unless they are absolutely required to reach your destination. Hurray for the country roads and relaxed driving.
Trunk Height Based on Location
Another neat and useful little feature: you will now be able to save your trunk opening height based on location rather than applying a single maximum trunk height everywhere. If you didn’t already know, you can set the maximum height your automated trunk opens, which can help prevent it from hitting a low garage ceiling.
This feature is already available on the refreshed Model Y but is now coming to all Model Ys, all Model 3s with automated trunks, and the 2021+ Model S and Model X.
In order to set your height, manually adjust the liftgate to your preferred opening height, and then press and hold the trunk button until you hear a chime in the vehicle, indicating that the height for this location has been set.
Save Frunk Height - Cybertruck
Tesla didn’t forget about the Cybertruck either - you can now do the same with the frunk opening height. On the Cybertruck, press and hold the exterior frunk button (below the bottom center of the frunk) until you hear a chime. Pressing the in-frunk button will simply close the frunk.
Accessory Power Option Enables 12V Sockets
Tesla is finally re-enabling 12V accessory power sockets throughout its cars with a new “Accessory Power” option, letting owners use the 12V power sockets in Tesla’s vehicle lineup while they’re away from the vehicle, without needing Camp Mode. This also applies to the USB ports and wireless phone chargers throughout the vehicle.
The Model Y and Model X include a 12V socket in the rear left pillar of the vehicle, alongside a 12V socket in the front of the vehicle. The Model 3 and Model S only have a 12V socket in the front of the vehicle.
You can turn this feature on by going to Controls > Charging > Keep Accessory Power On. This feature is disabled by default and is turned off once the vehicle battery drops to 20% or below. Tesla warns that this feature will use additional power, so it’s best to only use it when needed.
Comfort Drive Mode on the Cybertruck
Following the recent addition of the Comfort Mode option in the Model 3, Tesla is adding the feature to the Cybertruck as well. This feature will automatically switch the vehicle dynamics to “Comfort” - which includes a higher ride height, softer suspension and steering response, and a reduced acceleration profile (Chill Mode) - while FSD or TACC is active.
You can enable or disable this feature from Controls > Autopilot > Use Comfort Mode in Autopilot. This feature will be enabled by default.
Lane Departure Avoidance on the Cybertruck
Interestingly, the Cybertruck launched without several Autopilot safety and assistance features - namely because Basic Autopilot itself is missing from the Cybertruck; only FSD and TACC are available. As part of an improvement to safety, Lane Departure Avoidance has now arrived on the Cybertruck with the Spring Update.
This will show a blue indicator on the screen if you begin or are about to begin crossing a lane marking. You will have three options, just like on other Tesla vehicles: None, Warning, and Assistance. Assistance will provide active feedback and steer the vehicle back between the lane lines, while Warning will sound an audio tone and provide visual and physical (vibration) feedback through the steering wheel.
This feature will be enabled by default with Assistance selected and can be changed from Controls > Autopilot > Lane Departure Avoidance.
Minor Updates
Tesla also lists some smaller features that will be included as part of the 2025 Spring Update:
Keyboard Languages: Go to Controls > Display > Keyboards to switch languages on the touchscreen keyboard.
Media search results are filtered by sources, which provides faster access to your content.
You can now shuffle an entire Apple Music playlist that contains more than 100 songs!
You can scroll through SiriusXM favorites by tapping the left steering wheel button left or right, similar to other services.
You can now sign in to Amazon Music with an Amazon Music Free account. You still require Premium Connectivity or WiFi to stream music.
YouTube Music now shows what song will play next in the Up Next view of the media player.
If you normally connect your vehicle to your phone’s hotspot, your vehicle will now connect automatically every time you drive, instead of requiring you to connect it manually each time.
Features We’re Hoping Come Soon
This was an awesome update, but there are always more features we’d love to see come next. Here’s our short list of features we’re still waiting and hoping for.
Everyone’s favorite question is always: when will it be released? Well, it looks like soon. We haven’t seen any vehicles receive the Spring Update just yet, including employee vehicles. However, given that Tesla has officially announced the update, we expect it to go out to employees as soon as this weekend.
If no major issues are found, we could see it start rolling out to the lucky first customers in about a week, but be prepared for a slightly longer wait if Tesla needs to push multiple revisions of the update before rolling it out publicly.
Tesla’s factories are more than just gigantic facilities that produce cars at an insanely fast rate - they’re also places that live and thrive alongside the environment around them. At Gigafactory Berlin, Tesla planted over 1 million trees in 2024 to help offset the footprint of Giga Berlin. Tesla originally cut down less than half that number of trees.
One of Tesla’s major goals is to electrify the planet and reduce carbon emissions - and what better way to do that than to create more green space? Tesla has some absolutely fantastic green-centric initiatives, and Giga Berlin really showed that off when it planted double the acreage of forest that it cut down to build the facility.
A rendering of the expected finished locations.
Not a Tesla App
Ecological Paradise
Giga Berlin isn’t the only place where Tesla is working to bring in some green - Giga Texas is also on the list. Tesla has shown off its official renderings and plans for the upcoming Ecological Paradise that will be built around Giga Texas.
Tesla has some absolutely staggering plans for the location, and it’s far more than a simple park. With nearly 70,000 residents within 15 minutes of Giga Texas and 15,000 employees commuting in to work, there is a considerable number of people to appeal to. The site also includes nearly 4 miles of riverfront - so Tesla will be making the best use of the space to benefit the local community and its employees.
A map of the planned paths, routes, and areas.
Not a Tesla App
The plans for the Ecological Paradise include a 2-acre riverfront rain garden, as well as 6 rainwater buffer and treatment ponds - all intended to enhance, protect, and expand the 53 acres of local wetland. In total, 150 million gallons of reclaimed rainwater will be recycled for landscape irrigation.
Just as with green spaces, Tesla places considerable emphasis on water reuse and aims to make its factories as water-neutral as possible. This Ecological Paradise will help offset the water usage of Giga Texas while also greening up the entire community around it - all four of the new trailheads, as well as the riverbank, will be publicly accessible.
You can check out the entire filing for the Ecological Paradise, which includes a benefit report for the local community, at this link. It looks like quite a bit of green space, which will sit right alongside the 32,400 kW of solar panels installed at Giga Texas - the largest single installation of panels in the world.