Volkswagen is jumping head-first into the EV world. They're not hedging their bets like other manufacturers, and they're likely to do well, very well in fact. Their latest all-electric vehicle is the new ID.4.
The name may leave something to be desired, but it's a solid entry into the EV space. One interesting feature the ID.4 will offer is voice commands. Similar to a Tesla, you'll be able to adjust certain settings with just your voice. However, there are a few differentiating features that Tesla may want to look into if they're not already developing them.
First, there's a status light. Much like the indicator on your monitor, phone, or laptop, it lets you know the state of the car. The status light is a thin stripe along the bottom of the windshield. You can think of it much like the Alexa ring, which lights up when it hears the word "Alexa."
The ID. Light feature gives feedback to the driver. For example, it turns green when the vehicle is charging so that you can easily see it from your garage. It will also light up in various colors or areas when the driver receives a phone call or engages the turn signal. It's unclear whether this will be overused, but the idea is a good one.
The vehicle will also support hands-free voice activation. Similar to other smart assistants like Siri and Alexa, you'll be able to start a voice command by simply saying "Hello, ID," instead of having to push a voice command button as you do in a Tesla. This will also activate the ID. Light. Again, very similar to other smart assistants.
VW appears to be going all-in on its voice assistant. We're not sure everyone wants audible feedback or the ability to ask any question, but it could be useful at times. What's clear, though, is that VW appears to go beyond usefulness and into gimmicky territory in the video above. We don't need a welcome message when we get in the car, and the voice sounds far too robotic and drawn out, like something you'd hear in a 1980s movie portraying the future. However, if the voice assistant were toned down and as useful as Google Assistant, it could be a nice addition. Personally, I've sometimes wished I could easily hear back Tesla's current stock price, or find out the weather for tomorrow.
We hope Tesla is watching and considering adding similar features to Teslas. Hands-free voice activation and a thin light at the top or bottom of the main display could be great additions. What’s great is that Teslas already have all the hardware needed.
For all we know, we may be saying "Hey, Tesla" sometime soon. The mysterious V11 is sure to include many great features, and it's possible this could be included in that release or a future software update down the road.
If you haven't already seen all the voice commands that are supported in your Tesla, check out our full list of voice commands.
Last week, Mark Rober, an engineering YouTuber best known for his glitter bombs, released a video in which he tested Tesla's Autopilot against various conditions, including the iconic ACME painted wall.
During this test, many people noted that Mark was using Autopilot rather than FSD, even though his video was titled "Can You Fool a Self-Driving Car?" The Tesla on Autopilot went up against a vehicle equipped with Luminar's LIDAR rig, running some sort of basic autonomy or safety software.
New Video Tests FSD
Many people were disappointed with Mark’s video and his testing methods, so several creators got to work to actually test out Tesla’s FSD.
Creator Kyle Paul over on X made a much better follow-up video, using both an HW3 Model Y and an AI4 Cybertruck. In a relatively unsurprising turn of events, the Cybertruck successfully detected the wall, slowed down, and came to a stop. The Cybertruck was running FSD 13.2.8.
Kyle's team did a fantastic job building the wall and testing this in a private area using FSD rather than Autopilot. On top of that, they re-ran the test several times and recorded the entire thing inside and out. While Mark's video was more for entertainment, Kyle set out to prove what would actually happen in this unlikely scenario.
Sadly, the HW3 Model Y was unable to detect the wall, and manual intervention was required in each test. While the Model Y was running FSD 12.5.4.2 rather than an FSD V12.6 build, we don't expect this to have had a significant impact on the results; it's more an issue with how computer vision analyzes the environment.
There are several major differences between HW3 and HW4. The first is obviously that the version that runs on AI4 is more advanced, as the hardware is capable of processing a lot more data. However, AI4 also features much higher-resolution cameras than HW3, and Tesla recently added the ability for the video feeds to be processed at full resolution on FSD V13. This could have made the difference, although it’s not entirely clear. Perhaps if HW3 gets a version of FSD V13 in the future, HW3 can be retested to see if it passes the “ACME wall” test.
Watch
Kyle’s entire video is below. It’s only 10 minutes long, so definitely give it a watch. Props to Kyle on the quick and thorough execution.
What Does This Mean for FSD?
We broke down Mark's test and examined all the little issues we discovered after doing some in-depth research; you can read our analysis here.
Putting aside the issues with Mark's testing and instead using the new results: if you had to face Wile E. Coyote and his ACME tools in your Tesla, cartoon logic may win if you're in an HW3 vehicle. If you're in an AI4 vehicle, you'll likely come to a safe stop.
Vehicle depth perception is definitely something Tesla has been hard at work improving, and some fairly drastic improvements came with FSD V13 that haven't yet been fully carried over to FSD V12. Future versions of HW3 FSD may be able to successfully determine that the wall is there. So Kyle, if you're reading this, don't get rid of that wall. We'd love to see more testing in the future.
However, this entire test scenario is so far out of left field that there's a good likelihood the same test would fool some human drivers as well. The most important part is that the future of autonomy will not fall for these tricks, so it's very unlikely anyone could weaponize this idea, as it would only work on a small segment of vehicles.
If Wile E. Coyote is after you, someone else may drive into the wall before your Tesla does.
We're not kidding; this has really happened already. This isn't a realistic scenario outside of someone trying to pull an insane prank, but it's good to know that FSD V13 is capable of avoiding it.
Tesla regularly holds quarterly all-hands meetings for employees, but last night marked the first time Tesla has live-streamed the event for the public.
The meeting primarily focused on Tesla’s employees, recent achievements, and the future of Tesla. While it didn’t reveal much new information, it was interesting to see Elon Musk candidly engaging with his teams, who seem to genuinely enjoy working with him. Still, there were a few noteworthy takeaways.
As with Tesla’s Earnings Calls and other live events, we’ve put together a concise, easy-to-digest recap of everything discussed.
General Points
Work-related injuries have declined over time
Planning to expand to new markets
Cell Manufacturing
Continuing to invest in battery supply
Lowest cost-per-kWh cells
The Supercharger network continues to grow
Vehicle range and charging speed should match the needs of humans and their required time for breaks