On Sunday, Tesla started rolling out Full Self-Driving (Supervised) V12.3.6, the latest version of its FSD software. This update brings the highly anticipated new Autopark and High Fidelity Park Assist features to additional vehicles. FSD v12.3.6 replaces v12.3.5, which had only been rolled out to a small portion of the fleet.
The new Autopark (tap to park) and High Fidelity Park Assist features are now available for vehicles with ultrasonic sensors (USS). Since October 2022, Model 3 and Model Y vehicles have shipped without ultrasonic sensors, relying solely on Tesla Vision for Autopilot, Park Assist, and Autopark features. However, the vision-based Autopark has so far been limited to the U.S. and Canada.
New Autopark
When driving at low speed, the new Autopark highlights potential parking spaces, allowing the driver to pick their preferred spot. Tesla vehicles with ultrasonic sensors (USS) can now take advantage of the new Autopark feature, which is a significant improvement over the previous iteration.
Although the new Autopark feature is expanding to vehicles with USS, it appears to still be geographically limited to the U.S. and Canada. We expect Tesla to keep gathering feedback and release the feature in Europe and other regions in a future update.
Outside North America, vehicles without USS have never had any form of Autopark: they can't run the older Autopark version that relies on USS, and they have yet to receive this latest, vision-based revision of the feature.
For vehicles with the Intel-based infotainment unit, the visuals will look like the video below: instead of displaying a 3D environment of the vehicle's surroundings, the visualization simply highlights available parking spaces. However, the limitation for Intel vehicles is only in the visualization itself; the vehicle is just as aware of its surroundings as vehicles that display 3D renderings of objects on the screen.
With Tesla update 2024.3.25 (FSD v12.3.6), Tesla is also releasing High Fidelity Park Assist to vehicles with ultrasonic sensors. However, as initially suspected, it's limited to vehicles with the most recent infotainment processor, the Ryzen chip (MCU 3). Unfortunately, Intel Atom-based vehicles don't get the 3D visuals of High Fidelity Park Assist.
The feature provides drivers with a 360-degree 3D reconstruction of their vehicle's surroundings while parking at low speeds. It even accurately displays lane markers in parking lots, helping drivers visualize the environment around them when parking. The feature was added as a late addition to Tesla's 2023 holiday update. However, at the time, it was limited to vehicles without ultrasonic sensors (USS).
For vehicles that have ultrasonic sensors, users will have a choice: either continue using USS-based Park Assist, which displays exact distances to objects, or switch to the new High Fidelity Park Assist and forgo the distance readouts.
We had hoped that when Tesla finally released HiFi Park Assist to vehicles with USS, it would merge the two features and display the updated visuals alongside distance measurements; however, that is not the case in this update.
For owners with USS, the new Park Assist option is located under Controls > Autopilot and lets you choose between “Standard” and “Tesla Vision.” Tesla Vision is the new HiFi Park Assist, while Standard is the USS version with arcs and measurements.
Back in December 2023, Tesla’s director of Autopilot, Ashok Elluswamy, set expectations by stating that HiFi Park Assist would “eventually” come to vehicles with ultrasonic sensors. The new Park Assist feature is available in various regions around the world, including North America and most of Europe.
Tesla continues to double down on vision and Musk revealed that it's becoming “very clear that the vision-based approach with end-to-end neural networks is the right solution for scalable autonomy”.
When the 2024 Tesla Holiday Update originally launched, it introduced awesome new features, but unfortunately, one of the most exciting, the weather radar, was only available for vehicles with the AMD Ryzen processor (MCU 3).
Intel-based vehicles didn’t receive the precipitation maps at all and instead only had access to the Weather at Destination feature. However, we’re excited to report that Tesla has now released a version of the weather radar overlay that’s compatible with Intel vehicles.
Intel Precipitation Maps
It sounds like Tesla needed to optimize the precipitation map for Intel vehicles, which have a slower CPU. We received this news from a follower who reached out after finding that his Intel-based Model 3 in Norway could display the new weather map with update 2024.44.25.3.
This update began rolling out just recently and was originally seen as a bugfix update. In the release notes for the update, Tesla lists all of the Holiday features again, making it easy to miss the new ‘Precipitation Map and Weather at Destination’ feature.
While we thought the radar overlay feature might be reduced in some fashion for Intel cars, this isn't the case. This update brings the maps in their full capacity to Intel, with no reduction in features. The animated overlay, showcasing the last three hours of precipitation, works the same way as it does on AMD vehicles.
The precipitation icon shows up at the far right side of the screen, right next to the Superchargers icon.
You’re also able to zoom out or swipe to view the radar anywhere you like, or zoom in to be more precise. You can also use it while driving. However, the overlay is pretty distracting and can make the map harder to read when you’re trying to figure out where to go.
The last two key points for the new precipitation maps are that if you have Points of Interest (POIs) enabled, the precipitation overlay will hide them - except for Charging POIs. You’ll also need Premium Connectivity to take advantage of this feature - even if you’re connected to Wi-Fi via a hotspot.
We’re excited to see this feature arrive for Intel-based vehicles, as it shows Tesla is still committed to supporting them and finding new ways to optimize features on the older hardware. We’re hoping some of the other features arrive on Intel as well, including the new Parked screen and the updated full-screen Autopilot visualizations outside of North America.
With FSD V13.2.1 finally rolling out to HW4/AI4 vehicle owners this week, we’ve been super excited to see all the new features, including Park, Unpark, and Reverse in action for the first time.
However, that’s not everything - more is coming soon. We previously reported that Tesla is collecting audio input to build neural networks for audio, and now we’re learning that this capability will arrive in FSD V13.4.
Better Audio Handling
Ashok Elluswamy, Tesla’s VP of AI, mentioned that better handling of audio inputs is coming as part of FSD V13.4. That’ll be an interesting change, as the current handling of emergency vehicles on V13.2.1 is already pretty good.
However, we’re sure that being able to recognize emergency vehicles audibly will improve detection speed and reliability. Similarly to vision, FSD will start analyzing all the sounds it hears, and look for signs of emergency vehicles.
Using the Doppler effect, FSD will be able to make a reasonable determination of whether a siren is actually approaching or just echoing off nearby terrain or buildings. The effect is a simple physical principle: the observed frequency of a sound wave increases as the source moves toward the observer and decreases as it moves away.
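To illustrate the principle, here's a minimal sketch of the classic Doppler formula for a moving source and stationary observer. This is purely illustrative - the function name, siren frequency, and speeds are our own assumptions, not anything from Tesla's software.

```python
# Illustrative Doppler-shift calculation (not Tesla's implementation).
SPEED_OF_SOUND = 343.0  # m/s in air at roughly 20 degrees C

def observed_frequency(source_hz: float, source_speed_mps: float) -> float:
    """Frequency heard by a stationary observer.

    A positive source_speed_mps means the source is moving toward the
    observer; negative means it is moving away.
    """
    return source_hz * SPEED_OF_SOUND / (SPEED_OF_SOUND - source_speed_mps)

# A hypothetical 700 Hz siren on a vehicle moving at 25 m/s (~56 mph):
approaching = observed_frequency(700.0, 25.0)   # higher than 700 Hz
receding = observed_frequency(700.0, -25.0)     # lower than 700 Hz
print(approaching, receding)
```

Comparing the pitch heard now against the siren's known base pitch tells you whether the source is closing in or moving away - which is exactly the kind of cue a reflection off a building would distort.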
Interestingly, Tesla will be using the internal microphone for this task - as there are no external microphones on any Teslas… yet. This microphone is sufficient for one simple reason - sirens are made loud enough for humans to hear them inside a moving car.
Better Than a Human
Some users have wondered how the vehicle will be able to distinguish between sirens on the radio and in real life. While I’m sure we’re not the only ones to have ever been fooled by a siren on the radio, Teslas won’t be as easily fooled.
Tesla could actually take the audio going out to the radio and remove it from the sounds captured by the microphone, effectively removing sirens from the captured audio. In addition to being able to measure the intensity and direction of the sound, your vehicle should be able to accurately recognize emergency vehicles, even before a human can.
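The general idea of removing a known speaker signal from a microphone capture is a standard technique (acoustic echo cancellation). As a toy sketch of that concept, the following normalized LMS adaptive filter subtracts a "radio" reference signal from a microphone mix, leaving the "siren" behind. Everything here - the signals, filter length, and step size - is a made-up illustration of the technique, not Tesla's implementation.

```python
# Toy reference-based cancellation (NLMS adaptive filter), illustrating
# how known radio audio could be removed from a microphone signal.
import math
import random

def nlms_cancel(mic, reference, taps=8, step=0.5, eps=1e-8):
    """Subtract an adaptively filtered copy of `reference` from `mic`.

    Returns the residual, which keeps sounds (like a real siren) that are
    not present in the reference.
    """
    weights = [0.0] * taps
    residual = []
    for n in range(len(mic)):
        # Most recent `taps` reference samples, zero-padded at the start.
        window = [reference[n - k] if n - k >= 0 else 0.0 for k in range(taps)]
        estimate = sum(w * x for w, x in zip(weights, window))
        err = mic[n] - estimate
        residual.append(err)
        # Normalized LMS weight update.
        power = sum(x * x for x in window) + eps
        weights = [w + step * err * x / power for w, x in zip(weights, window)]
    return residual

# Simulated radio audio leaking into the cabin mic along with a real siren.
random.seed(0)
radio = [random.uniform(-1, 1) for _ in range(4000)]
siren = [0.3 * math.sin(2 * math.pi * 700 * n / 16000) for n in range(4000)]
mic = [0.8 * r + s for r, s in zip(radio, siren)]

cleaned = nlms_cancel(mic, radio)
# After the filter adapts, `cleaned` tracks the siren far more closely
# than the raw microphone signal does.
```

Because the car knows exactly what waveform it is sending to the speakers, a filter like this can model the speaker-to-mic path and cancel it, so a siren playing on the radio would be removed while a real one outside would survive.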
Opt-In Audio Sharing
Tesla is now allowing FSD users to opt-in to sharing audio data. The prompt for sharing audio data is on FSD V12.5.6.4, V13.2 and V13.2.1. It’s also expected to be in the upcoming Hardware 3 version of FSD 12.6.
However, it’s worth noting that Tesla’s release notes between V13.2 and V13.2.1 changed slightly for audio sharing. Tesla initially mentioned that the vehicle would capture 10-second audio clips when a siren is heard.
In FSD V13.2.1, Tesla updated the data-sharing feature, letting users know that audio recordings are now captured when the vehicle detects an emergency vehicle, rather than when it detects a siren. The audio clip is also no longer limited to 10 seconds.
Opting into audio sharing will share microphone recordings alongside all the other data that Tesla regularly collects as part of its FSD training. Of course, if you're uncomfortable with that, you’ll be able to opt out of just the audio portion. Tesla’s privacy policy also discloses that it anonymizes and sanitizes the data during collection and processing.
While vision plays a much larger role, expect Tesla to capture and analyze audio data in much the same manner.
We’ve already seen improved handling for school buses on V13.2, so we’re excited to see what else Tesla does in the next few months. Perhaps handling school zones would be the next big item to tackle.