Tesla’s latest app update, version 4.37.1, introduces support for Critical Alerts on the iPhone. These alerts are designed for emergency situations and will override standard notification settings, including the mute switch, Focus modes, and Do Not Disturb. This ensures that owners are notified when immediate action is required, even if their phone is set to silent.
How to Enable Critical Alerts
Activate Dog Mode: Open the Tesla app and turn on Dog Mode.
Main App Screen: After activating Dog Mode, go back to the main section of the app.
Enable Critical Alerts: A new option called Critical Alerts will appear under Live Camera and the quick action icons (Tip: You can add up to 5 quick actions). It’ll state, “Grant permissions to receive critical notifications.” Tap it, and you’ll see a system dialog asking for permission to allow critical alerts for the Tesla app.
Confirm Permission: Choose to allow the Tesla app to send critical notifications, which will grant the app special privileges for sending urgent alerts.
This new feature is likely intended for situations where Dog Mode needs to be turned off unexpectedly. For example, if the vehicle’s battery drops below 20%, or if the cabin temperature rises or falls drastically outside your set temperature, the Tesla app is expected to send a critical alert to the owner, ensuring they are informed right away.
What Are Critical Alerts?
Critical Alerts are a type of iOS notification available only to approved apps, which must meet specific criteria set by Apple. These alerts are designed to be used in emergencies and bypass all standard notification restrictions. This means that regardless of whether your phone is silenced or in Do Not Disturb mode, the alert will play a sound and appear prominently.
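For developers curious about the mechanics, below is a minimal sketch, not Tesla’s actual implementation, of how an iOS app asks for this permission using Apple’s UserNotifications framework. The function name is hypothetical, and shipping critical alerts also requires a special entitlement that Apple grants on a per-app basis.

```swift
import UserNotifications

// Minimal sketch: requesting Critical Alert permission in an iOS app.
// Shipping this requires Apple's critical alerts entitlement
// (com.apple.developer.usernotifications.critical-alerts); without it,
// the .criticalAlert option is ignored by the system.
func requestCriticalAlertPermission() {
    let center = UNUserNotificationCenter.current()
    center.requestAuthorization(options: [.alert, .sound, .criticalAlert]) { granted, error in
        if let error = error {
            print("Notification authorization failed: \(error.localizedDescription)")
            return
        }
        // `granted` reflects the user's choice in the system dialog,
        // the same kind of dialog the Tesla app shows when you tap Critical Alerts.
        print("Critical Alerts allowed: \(granted)")
    }
}
```

Once permission is granted, a notification delivered with a critical sound (for example, UNNotificationSound.defaultCritical) plays audibly even when the phone is silenced or in Do Not Disturb.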
By adding support for Critical Alerts, Tesla is prioritizing the safety of pets who may be left inside the vehicle with the expectation that Dog Mode will keep conditions safe. This change makes the app more reliable for alerting owners when immediate attention is needed, ensuring they are always kept informed—even when their phone’s settings would typically prevent other notifications from coming through.
Last week, Mark Rober, an engineering YouTuber best known for his glitter bomb videos, released a video in which he tested Tesla's Autopilot against various conditions - including the iconic ACME painted wall.
During this test, many people noted that Mark was using Autopilot rather than FSD, even though his video was titled “Can you Fool a Self-Driving Car?”. The Tesla on Autopilot went up against a vehicle equipped with Luminar’s LIDAR rig, running some sort of basic autonomy or safety software.
New Video Tests FSD
Many people were disappointed with Mark’s video and his testing methods, so several creators got to work to actually test out Tesla’s FSD.
Creator Kyle Paul over on X made a much better follow-up video, using both an HW3 Model Y and an AI4 Cybertruck. In a relatively unsurprising turn of events, the Cybertruck successfully detected the wall, slowed down, and came to a stop. The Cybertruck was running FSD 13.2.8.
Kyle’s team did a fantastic job building the wall and testing this in a private area using FSD rather than Autopilot. On top of that, they re-tested the results several times and recorded the entire thing inside and out. While Mark’s video was more for entertainment, Kyle set out to show what would actually happen in this unlikely scenario.
Sadly, the HW3 Model Y was unable to detect the wall, and manual intervention was required in each test. While the Model Y was running FSD 12.5.4.2 rather than an FSD V12.6 build, we don’t expect this to have had a significant impact on the test - this is more of an issue with how computer vision analyzes the environment.
There are several major differences between HW3 and AI4 (also known as HW4). The first is obviously that the FSD build that runs on AI4 is more advanced, as the hardware is capable of processing a lot more data. However, AI4 also features much higher-resolution cameras than HW3, and Tesla recently added the ability to process the video feeds at full resolution in FSD V13. This could have made the difference, although it’s not entirely clear. If HW3 gets a version of FSD V13 in the future, it could be retested to see whether it passes the “ACME wall” test.
Watch
Kyle’s entire video is below. It’s only 10 minutes long, so definitely give it a watch. Props to Kyle on the quick and thorough execution.
What Does This Mean for FSD?
We broke down Mark’s test and examined all the little issues we discovered after doing some in-depth research - you can read our analysis here.
Putting aside the issues with Mark’s testing and looking at these new results instead - it seems that if you had to face off against Wile E. Coyote and his ACME tools in your Tesla, cartoon logic may win if you’re on an HW3 vehicle. If you’re on an AI4 vehicle, you’ll likely come to a safe stop.
Vehicle depth perception is definitely something Tesla has been hard at work improving - some fairly drastic improvements came with FSD V13 that haven’t yet been fully carried over to FSD V12. Future versions of FSD on HW3 may be able to successfully detect that the wall is there. So Kyle - if you’re reading this - don’t get rid of that wall. We’d love to see more testing in the future.
However, this entire test scenario is so far out of left field that there’s a good chance it would fool some human drivers as well. The most important part is that the future of autonomy won’t fall for these tricks, so it’s very unlikely that someone could weaponize this idea - it would only potentially work on a small segment of vehicles.
If Wile E. Coyote is after you, someone else may drive into the wall before your Tesla does.
We’re not kidding, this really happened already. This isn’t a realistic scenario outside of someone trying to play an insane prank - but it’s good to know that FSD V13 is capable of dodging this.
Tesla regularly holds quarterly all-hands meetings for employees, but last night marked the first time Tesla has live-streamed the event to the public.
The meeting primarily focused on Tesla’s employees, recent achievements, and the future of Tesla. While it didn’t reveal much new information, it was interesting to see Elon Musk candidly engaging with his teams, who seem to genuinely enjoy working with him. Still, there were a few noteworthy takeaways.
As with Tesla’s Earnings Calls and other live events, we’ve put together a concise, easy-to-digest recap of everything discussed.
General Points
Work-related injuries declined over time
Planning to expand to new markets
Cell Manufacturing
Continuing to invest in battery supply
Cheapest cell, with the lowest cost per kWh
The Supercharger network continues to grow
Vehicle range and charging speed should match the needs of drivers and the time they require for breaks