Elon Musk says we may get FSD Beta v11.3 as early as this week
Not a Tesla App
Elon Musk has provided further guidance on Tesla's next major release of FSD Beta, v11.3. Last week Musk said that v11.3 would be ready in about 'two weeks' and would contain many major improvements.
We've heard these two-week estimates before, but it's reassuring that Musk is being specific with details on this upcoming release.
Now just seven days after his last tweet, Elon Musk is giving us more details on this significant upgrade.
Timeline
Musk said he expects FSD Beta v11.3 to start rolling out to some customers later this week, or next week at the latest. This lines up with the two-week estimate he gave a week ago.
Elon Musk's estimates are known to be overly optimistic, but given the number of details he's releasing, it sounds like Tesla may be close to releasing this next build.
FSD Beta v11.3 will likely roll out to Tesla employees first and then go out to select FSD Beta testers, possibly the original 'OG' group. Since this is a major milestone with significant changes, expect a slow, gradual rollout. Although we may get our first glimpse of FSD Beta v11 this week, it may be several weeks or more before the majority of customers have access to this beta.
Neural Nets for Vehicle Behavior
A week ago, Musk said this upgrade would include 'many major improvements.' Last night he revealed some additional details. He said there will be "many small things," one of which is that Tesla will begin using neural nets for vehicle navigation and control, rather than only for vision.
Today Tesla uses neural networks to determine the vehicle's surroundings, where objects are, what they are, and their distances from the vehicle to create a 3D environment known as 'vector space.' With this information, the vehicle can then plan a path and navigate around these objects toward its destination.
However, based on Musk's comment, it sounds like Tesla is currently only using neural nets to determine its environment and not for controlling the vehicle. This means that how the vehicle behaves, how it finds a path, and how it moves is still a process that is coded traditionally.
In the same way that Tesla uses millions of images to determine what a stop sign or traffic cone is, it sounds like Tesla will now use a large number of examples to determine how to best control the vehicle in various situations.
That could mean Tesla will take millions of high-quality examples of real driving behavior, such as how drivers gradually accelerate or slow down, to determine how the vehicle should behave in different situations.
This could be applied to every driving characteristic, such as turning, slowing down, driving around a parked vehicle, and more.
If Tesla starts leveraging neural nets to aid vehicle control we may soon see drastic improvements to vehicle behavior, making it much smoother and human-like.
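The difference between hand-coding behavior and learning it from examples can be sketched with a toy model. The snippet below is our own illustration, not Tesla's code: the example data is made up, and the one-parameter policy accel = k × speed_gap stands in for what would really be a neural network. It fits the gain k to recorded driver behavior with gradient descent.

```python
# Toy illustration of learning a control policy from driving examples,
# rather than hand-coding it. The data and the one-parameter policy
# accel = k * speed_gap are made up for demonstration.

# Hypothetical (speed gap in m/s, observed human acceleration in m/s^2) pairs
examples = [(2.0, 1.0), (4.0, 2.1), (6.0, 2.9), (1.0, 0.6)]

k = 0.0    # learned gain, starts untrained
lr = 0.01  # learning rate
for _ in range(500):
    # Gradient of the mean squared error between the policy and the examples
    grad = sum(2 * (k * gap - accel) * gap for gap, accel in examples) / len(examples)
    k -= lr * grad

# The fitted policy now mimics the accelerations seen in the examples.
print(round(k, 2))  # converges to roughly 0.5 for this data
```

In practice the policy would be a deep network fed camera-derived inputs, but the principle is the same: the behavior comes from fitting examples rather than from hand-written rules.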
New Features
Although Elon Musk didn't specifically mention new features coming to FSD Beta v11.3, there are several that have been talked about in the past that could show up in this major full self-driving update.
Reverse Creep
Reverse Creep has been talked about as far back as FSD Beta 10.13. This feature would allow the vehicle to shift into reverse to move out of the way of danger or adjust its trajectory. Right now, FSD Beta will only ever move forward, so this improvement would be a giant step toward human-like behavior.
A good use case is when the vehicle creeps forward for better visibility. There may be times when the vehicle sees a car coming only after it has crept forward; in these cases, it would be smart to let the vehicle reverse back to its previous position if it is now in the path of traffic.
Navigating Without Map Data & GPS
In the past, Musk has also said that Tesla is working on the neural networks' ability to perform 'dead reckoning' navigation, which means navigating using only onboard measurements such as speed, direction, and wheel movement.
He gave underground parking garages as an example of where FSD would need the ability to navigate without GPS or map data.
The car will be able to do this by taking its last known GPS location and then estimating its position over time using only a compass, wheel movement, and speed.
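The idea behind dead reckoning can be shown in a few lines. This is our own minimal sketch, not Tesla's implementation: starting from the last known position, each update integrates the compass heading and wheel-derived speed to advance the position estimate without GPS.

```python
# Minimal dead-reckoning sketch (illustrative only, not Tesla's code).
# From a last known position, each step integrates heading and
# wheel-derived speed to estimate the new position without GPS.
import math

def dead_reckon(x, y, heading_deg, speed_mps, dt):
    """Advance an (x, y) position estimate by one time step.

    heading_deg: compass heading in degrees (0 = north, 90 = east)
    speed_mps:   speed derived from wheel rotation, in meters/second
    dt:          time step in seconds
    """
    heading = math.radians(heading_deg)
    # In this local frame, north is +y and east is +x.
    x += speed_mps * dt * math.sin(heading)
    y += speed_mps * dt * math.cos(heading)
    return x, y

# Example: last GPS fix at a garage entrance, then drive east at 5 m/s for 2 s.
x, y = 0.0, 0.0
for _ in range(2):
    x, y = dead_reckon(x, y, heading_deg=90.0, speed_mps=5.0, dt=1.0)
# The estimate is now about 10 m east of the entrance.
```

Real systems accumulate error with every step, which is why the car would fall back to this only where GPS is unavailable, such as underground garages.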
FSD Beta v11
FSD Beta v11 has always been expected to be a big leap forward, and as we get closer, that hasn't changed. This update is expected to be a huge improvement to what is currently available to customers. Although Musk's timelines have usually shifted and features have typically taken longer than initially planned, it looks like we may be getting close to the next major release for FSD Beta.
Although we're still years away from true full self-driving, Tesla's mission inches closer with every update.
Recently there was also a leak revealing some details of Tesla's upcoming cameras in hardware 4.0, which are expected to include a fan and heater for select cameras.
Another quarter has passed, and that means it’s time to submit questions and vote for Tesla’s Q2 2025 Earnings Call. While Q1 was a tough quarter for the company, Q2 saw some recovery in sales, although there’s still some work to be done.
However, there’s always a lot to be excited about during Tesla’s Q&A session, where we usually learn a lot about future software improvements and upcoming vehicles. We may hear more about FSD Unsupervised, the Robotaxi, the more affordable vehicle, or the upcoming larger six-seater Model Y, the Model Y L. Tesla also mentioned a potential FSD price hike during the Q1 2025 Earnings Call, so that could come up as well.
Tesla’s Q2 So Far
Tesla has already released their Q2 2025 Production and Delivery numbers, which were up from Q1 of this year, but still down compared to Q2 last year.
Model                        Production   Deliveries
Model 3/Y                    396,835      373,728
Model S, X, and Cybertruck   13,409       10,394
Total                        410,244      384,122
How to Submit & Vote
Tesla lets shareholders submit questions that are voted on and may be answered during the Q&A session. To submit your own question or vote on one that has already been submitted, you’ll need to be a verified shareholder. To verify, go to Say’s platform and link your brokerage account.
Once your account is verified, you’ll be able to log in and vote your shares on your own question or on someone else’s.
Here’s the link to get started on Say’s Tesla Q&A. You must submit your questions and votes by July 23rd, 2025, at 4:00 PM EDT.
Top Questions So Far
Unsurprisingly, people have already been submitting questions, and here are the top ones so far.
Can you give us some insight how robotaxis have been performing so far and what rate you expect to expand in terms of vehicles, geofence, cities, and supervisors?
What are the key technical and regulatory hurdles still remaining for unsupervised FSD to be available for personal use? Timeline?
What specific factory tasks is Optimus currently performing, and what is the expected timeline for scaling production to enable external sales? How does Tesla envision Optimus contributing to revenue in the next 2–3 years?
Can you provide an update on the development and production timeline for Tesla’s more affordable models? How will these models balance cost reduction with profitability, and what impact do you expect on demand in the current economic climate?
Are there any news for HW3 users getting retrofits or upgrades? Will they get HW4 or some future version of HW5?
When do you anticipate customer vehicles to receive unsupervised FSD?
And here are some other ones we found interesting:
Have any meaningful Optimus milestones changed for this year or next and will thousands of Optimus be performing tasks in Tesla factories by year end?
Are front bumper cameras going to be necessary for unsupervised full self driving? If so, what is the company's plan to retrofit vehicles that do not have them?
Will there be a new AI Day to explain the advancements the Autopilot, Optimus, and Dojo/chip teams have made over the past several years? We still do not know much about HW4.
Earnings Call Details
Tesla will hold its earnings call on Wednesday, July 23rd, at 4:00 PM EDT. It's still early for an access link, but we’ll make sure we have a link up on the site before the earnings call that day.
If you do miss the earnings call, no worries. We will provide a full recap following the call, and we’ll also do some in-depth dives into what was said and what we know.
Tesla’s Summer Update, 2025.26, has finally launched, bringing with it a batch of interesting new features for some, and a bunch of quality-of-life improvements for everyone else.
Grok AI Assistant
The star of the Summer Update is Grok, xAI’s conversational AI assistant, which has now landed in Tesla vehicles. For now, it's available in any Tesla with an AMD processor, and it may come to Intel-based vehicles in the near future. The feature is also only available in the U.S., but it’s expected to expand to other regions, hopefully soon.
Grok is in its first iteration as an in-vehicle assistant, and for now, cannot control the vehicle, which means that Tesla’s voice command system is still intact. However, there is a lot it can do already. Grok is activated by pressing and holding the voice button (right scroll wheel on older vehicles), while a short press of the button is still reserved for voice commands. Grok will support a wake word in the future, letting you activate it without pressing a button.
Once Grok is open, which can also be done by tapping the Grok app icon, users can tailor the AI personality according to their preferences by selecting a persona and voice of their choice.
There are also several other settings for Grok under the settings button. You can enable NSFW mode, Kids Mode, or disable access to your vehicle’s location.
Grok has contextual awareness of your vehicle location, which means it can provide relevant answers to questions like “Where should I go for dinner?”
Logging In Not Required
Grok is included with Premium Connectivity, and it also works over your phone’s hotspot or WiFi, so anyone can try it for free. In fact, you don’t even need to log in to start using Grok. However, logging in adds some additional features.
If you’d like to log in, you can do so by scanning the QR code in the vehicle, which will provide chat management and transcripts, SuperGrok access (if you pay for a subscription), and better privacy control.
Light Sync
Tesla has added a new Light Sync feature that pulses the vehicle’s ambient lighting in sync with the music being played. The option is enabled under Toybox > Light Sync. There are also a few options, including the ability to match the ambient light colors to the album artwork instead of using your selected color.
In addition, you can enable Rave Cave while parked, which cranks up the ambient lighting brightness to its maximum.
Dashcam App Update
The Dashcam app now allows you to adjust playback speeds, just like the older Dashcam Viewer, which is still used on Intel-based vehicles.
In addition to adjusting playback speed, you can now adjust the video view so that it’s displayed without being obstructed by the buttons at the top (video below). The difference is small, but could be useful if you’re trying to see something slightly out of view or that’s hidden behind the top Dashcam buttons.
While the Cybertruck has also received the updated Dashcam Viewer with this update, it does not have the new B-pillar camera recordings like other HW4 cars.
Sentry Mode Ambient Lighting
Sentry Mode is getting one of the best uses of the vehicle’s ambient lighting we’ve seen so far. The ambient lighting will now slowly pulse red while Sentry Mode is activated to grab someone’s attention, instead of relying solely on the vehicle’s display.
While you can disable Sentry Mode sounds, we’d love to see an even more stealthy Sentry Mode that also disables the ambient lighting and screen, allowing the vehicle to record without anyone being aware.
Since the ambient lighting is being used in this case to make people more aware of a feature, Tesla could also use it in other modes, such as Dog Mode.
There are a lot of potential uses for ambient lighting; Tesla could make it glow while the vehicle is charging, with the brightness tied to the vehicle's charge level.
Supercharger Site Details
When you navigate to a Supercharger, new icons in the charger list will indicate locations that require valet service or pay-to-park access.
Upon arrival at the location, a notification will appear on your screen, displaying important details such as access codes, parking restrictions, level/floor information for parking garages, and restroom availability. This information will also be available on the site card in the navigation.
Equalizer Presets
Tesla has moved the audio settings from the music player directly into the vehicle settings, making them much easier to find. In addition to creating a new “Audio” section in settings, Tesla now lets you create and save equalizer presets.
Each preset can have a name, custom EQ settings, and a setting for immersive audio.
Onboarding Guide
Tesla has introduced a new Onboarding Guide for new owners. The guide covers driver settings, touchscreen use, steering wheel and seat setup, and how to control key portions of the vehicle, including lights, wipers, and Autopilot features.
The Onboarding Guide is automatically initiated when a new owner accepts delivery of a Tesla, or can be manually initiated at any time by going to Controls > Service > Onboarding Guide.
This appears to only be available for the new Model 3 and new Model Y.
In typical Tesla fashion, the 2025.26 update is rolling out gradually in small waves. Three waves have already gone out, so all signs point to a wide release soon.