Tesla design lead Pawel Pietryka, who left the company last month, has shown off some prototype design videos of the Cybertruck and Model 3/Y on his website as part of his portfolio. The footage offers an interesting look at both the Cybertruck and the Model 3/Y UI. We covered the Model 3/Y video, which may show part of the new FSD visualizations, here if you haven't seen it yet.
These videos appear to be recent, as the Model 3 video contains many recent UI changes. There’s a lot we haven’t seen of the Cybertruck, so this video is especially interesting; in some areas, it gives us a first peek at the Cybertruck UI.
The Tesla Model S and X use similar UI elements as the Model 3 and Y. The design language used is essentially the same and the UI only differs when necessary, based on feature or hardware changes.
However, based on this video it doesn’t look like Tesla has any plans to merge the Cybertruck UI with the rest of the Tesla models. As unique as the Cybertruck is on the outside, it looks like it will be just as unique on the inside.
Tesla Cybertruck UI
You can check out the full video below.
Although the Cybertruck will be similar to the Model 3/Y with its single, center-mounted screen, it appears to differ by mostly displaying a single app on screen at a time.
Whereas the Model 3 displays visualizations on the left side and the music or maps app on the right side, the Cybertruck incorporates the visualizations and the map into a single view. It shows the truck’s visualization inside the map to let you know where you are, instead of a standard arrow.
In order for the visualizations and surroundings to still be useful, the map uses a much more zoomed-in view. It’s an interesting concept that merges the nearby environment renderings derived from the truck’s cameras with farther-away map data like buildings and streets, essentially combining real-time environment data with pre-mapped data into a single view. It makes a lot of sense in some ways, but you do lose the big-picture route view that the map usually provides.
When the truck is parked and there’s no need for a map view, the truck visualization takes up the whole screen, allowing you to open or close the frunk and tailgate, or adjust the suspension, in a nice full-screen view.
We also get our first look at the HVAC controls. In this particular instance, since the truck is driving, the HVAC controls fluidly slide in from the left, revealing similar controls to current Tesla models, but with the Cybertruck interior.
The controls look very simplified and only include a power button, temperature control and vent direction control, but on closer inspection it looks like you’ll be able to slide over to reveal another HVAC pane with additional controls, such as keeping the climate on when exiting the car, turning the AC on or off, and controlling air recirculation.
All signs point to the Cybertruck doing extremely well, but one thing is for sure: the Cybertruck will be just as different and interesting on the inside as it is on the outside.
Last week, Mark Rober, an engineering YouTuber best known for his glitter bombs, released a video where he tested Tesla's Autopilot in various conditions - including the iconic ACME painted wall.
During this test, many people noted that Mark was using Autopilot rather than FSD, even though his video was titled “Can you Fool a Self-Driving Car?”. The Tesla on Autopilot went up against a vehicle equipped with Luminar’s LIDAR rig, running some sort of basic autonomy or safety software.
New Video Tests FSD
Many people were disappointed with Mark’s video and his testing methods, so several creators got to work to actually test out Tesla’s FSD.
Creator Kyle Paul over on X made a much better follow-up video, using both a HW3 Model Y and an AI4 Cybertruck. In a relatively unsurprising turn of events, the Cybertruck successfully detected the wall, slowed down, and came to a stop. The Cybertruck was running FSD 13.2.8.
Kyle’s team did a fantastic job building the wall and testing it in a private area using FSD rather than Autopilot. On top of that, they re-ran the tests several times and recorded the entire process from inside and outside the vehicle. While Mark’s video was more for entertainment, Kyle set out to show what would really happen in this unlikely scenario.
Sadly, the HW3 Model Y was unable to detect the wall, and manual intervention was required in each test. While the Model Y was running FSD 12.5.4.2 rather than an FSD V12.6 build, we don’t expect this to have had a significant impact on the test - this is more of an issue with how computer vision analyzes the environment.
There are several major differences between HW3 and HW4. The first is obviously that the version that runs on AI4 is more advanced, as the hardware is capable of processing a lot more data. However, AI4 also features much higher-resolution cameras than HW3, and Tesla recently added the ability for the video feeds to be processed at full resolution on FSD V13. This could have made the difference, although it’s not entirely clear. Perhaps if HW3 gets a version of FSD V13 in the future, it can be retested to see if it passes the “ACME wall” test.
Watch
Kyle’s entire video is below. It’s only 10 minutes long, so definitely give it a watch. Props to Kyle on the quick and thorough execution.
What Does This Mean for FSD?
We broke down Mark’s test and examined all the little issues we discovered after doing some in-depth research - you can read our analysis here.
Putting aside the issues with Mark’s testing and looking at the new results instead, it seems that if you had to face off against Wile E. Coyote and his ACME tools in your Tesla, cartoon logic may win if you’re in an HW3 vehicle. If you’re in an AI4 vehicle, you’ll likely come to a safe stop.
Vehicle depth perception is definitely something Tesla has been hard at work improving - and some fairly drastic improvements came with FSD V13 that haven’t been fully brought over to FSD V12 just yet. Future versions of HW3 FSD may be able to successfully determine that the wall is there. So Kyle - if you’re reading this - don’t get rid of that wall. We’d love to see more testing in the future.
However, this entire test scenario is so out of left field that there’s a good likelihood the same test would fool some human drivers as well. The most important part is that the future of autonomy will not fall for these tricks, so it’s very unlikely anyone could weaponize this idea, as it would only work on a small segment of vehicles.
If Wile E. Coyote is after you, someone else may drive into the wall before your Tesla does.
We’re not kidding; this has really happened already. This isn’t a realistic scenario outside of someone playing an insane prank, but it’s good to know that FSD V13 is capable of handling it.
Tesla regularly holds quarterly all-hands meetings for employees, but last night marked the first time Tesla has live-streamed the event for the public.
The meeting primarily focused on Tesla’s employees, recent achievements, and the future of Tesla. While it didn’t reveal much new information, it was interesting to see Elon Musk candidly engaging with his teams, who seem to genuinely enjoy working with him. Still, there were a few noteworthy takeaways.
As with Tesla’s Earnings Calls and other live events, we’ve put together a concise, easy-to-digest recap of everything discussed.
General Points
Work-related injuries have declined over time
Planning to expand to new markets
Cell Manufacturing
Continuing to invest in battery supply
Aiming for the cheapest, lowest cost-per-kWh cells
The Supercharger network continues to grow
Vehicle range and charging speed should match the needs of humans and their required time for breaks