Tesla Arcade has set the standard for automotive gaming, but the automaker has bigger plans for gaming in its vehicles.
In the future you'll be able to play Steam games in your Tesla
A recent Twitter conversation between gaming journalist Ryan McCaffrey (@DMC_Ryan) and Tesla CEO Elon Musk (@elonmusk) revealed that Tesla is looking to make Steam games available in its cars. Steam is the largest digital distribution platform for PC games. Musk said the company is now working on bringing Steam's broad library to its vehicles rather than adding specific titles individually.
We’re working through the general case of making Steam games work on a Tesla vs specific titles. Former is obviously where we should be long-term.
- Elon Musk
Tesla's infotainment system runs a custom Linux OS on x86 CPUs, the same architecture as a desktop PC. This means Tesla could conceivably make a broad catalog of Steam games available through its infotainment system. Tesla is likely aiming to let a wide range of Steam games run without asking studios to port their games or requiring specific modifications from developers.
While previous versions of Tesla's infotainment system ran on Intel Atom CPUs, the carmaker recently introduced AMD Ryzen-based hardware (MCU 3), which delivers the higher performance needed to run modern games.
The new hardware also has more storage to handle additional games on the platform. MCU 3 is reportedly capable of next-gen gaming on par with the PS5, which would be ideal if the vehicle is going to run highly detailed titles. Tesla owners can expect an expanding array of video games, although some may require the new AMD-powered MCU 3 infotainment system.
Musk also confirmed that Cyberpunk 2077 will be playable in the long-awaited Cybertruck, deliveries of which are expected to begin next year.
Since Tesla has confirmed that the Cybertruck will be able to play Cyberpunk 2077, one of today's most demanding games, we expect all vehicles with MCU 3 to gain access to the Steam store.
Tesla gaming remains highly anticipated among owners. Musk believes that “entertainment will be critical when cars drive themselves”, a milestone he thinks Tesla can reach later this year. Despite the NHTSA clamping down on gaming while vehicles are in motion, it's evident that Tesla is pressing ahead at full steam to offer the widest array of video games in any vehicle.
At the 2025 Consumer Electronics Show, Nvidia showed off its new consumer graphics cards, home-scale compute machines, and commercial AI offerings. One of these was the new Nvidia Cosmos training system.
Nvidia is a close partner of Tesla - in fact, it produces and supplies the GPUs Tesla uses to train FSD: the H100s and soon-to-be H200s located at the new Cortex Supercomputing Cluster at Giga Texas. With Cosmos, Nvidia will also challenge Tesla's lead in developing and deploying synthetic training data for autonomous driving systems.
However, this is far more important for other manufacturers. We’re going to take a look at what Nvidia is offering and how it compares to what Tesla is already doing. We’ve done a few deep dives into how Tesla’s FSD works, how Tesla streamlines FSD, and, more recently, how they optimize FSD. If you want to get familiar with a bit of the lingo and the background knowledge, we recommend reading those articles before continuing, but we’ll do our best to explain how all this synthetic data works.
Nvidia Cosmos
Nvidia’s Cosmos is a generative AI model created to accelerate the development of physical AI systems, including robots and autonomous vehicles. Remember - Tesla’s FSD is the same software stack that powers its humanoid robot, Optimus. Like Tesla, Nvidia is aiming to tackle physical, real-world deployments of AI everywhere from your home to your street to your workplace.
Cosmos is a physics-aware engine that learns from real-world video and builds simulated video inputs. It tokenizes the video fed into the system so that AI models can learn from it more quickly. Sound familiar? That’s essentially how FSD learns as well.
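As a rough illustration of what "tokenizing" video means, here is a minimal sketch in the style of common vision-model patch tokenizers: each frame is split into fixed-size patches, and each patch becomes one flat vector ("token"). This is not Cosmos's actual tokenizer - Nvidia hasn't detailed it here - and the patch size and shapes are illustrative assumptions.

```python
import numpy as np

def patchify(frames: np.ndarray, patch: int = 16) -> np.ndarray:
    """Turn a (T, H, W, C) video clip into (T * num_patches, patch*patch*C) tokens.

    Illustrative only: real tokenizers typically learn a compressed
    embedding rather than using raw pixel patches.
    """
    t, h, w, c = frames.shape
    assert h % patch == 0 and w % patch == 0, "frame size must divide evenly"
    # Split H and W into a grid of patches, then flatten each patch.
    x = frames.reshape(t, h // patch, patch, w // patch, patch, c)
    x = x.transpose(0, 1, 3, 2, 4, 5)          # (T, H/p, W/p, p, p, C)
    return x.reshape(-1, patch * patch * c)     # one row per patch token

video = np.zeros((8, 64, 64, 3), dtype=np.uint8)  # 8 frames of 64x64 RGB
tokens = patchify(video)
print(tokens.shape)  # (128, 768): 8 frames x 16 patches, 16*16*3 values each
```

A transformer-style model would then learn over these token sequences instead of raw pixels, which is what makes large-scale video training tractable.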
Cosmos also has the capability to do sensor-fused simulations. That means it can take multiple input sources - video, LiDAR, audio, or any other source the user supplies - and fuse them into a single world simulation for an AI model to learn from. This helps train, test, and validate autonomous vehicle behavior in a safe, synthetic format while also providing a massive breadth of data.
Data Scaling
Of course, Cosmos itself still requires video input - the more video you feed it, the more simulations it can generate and run. Data scaling is a necessity for AI applications: you need to feed the system ever more data to build ever more scenarios for it to train on.
Synthetic data also has a problem - is it realistic? Can it predict real-world situations? In early 2024, Elon Musk commented on this problem, noting that data scales enormously in both the real world and simulation, but that real-world data remains the better way to gather testing data. After all, no AI can fully predict the real world just yet - in fact, that's the kind of problem even the brightest minds in quantum computing are still working on.
Yun-Ta Tsai, an engineer on Tesla’s AI team, also noted that hand-writing code or generating scenarios can’t cover everything even the wildest AI hallucinations might come up with. Plenty of optical phenomena and real-world situations don’t fit neatly into the rigid training sets an AI would generate on its own, so real-world data is essential to build a system that can actually train a useful real-world AI.
Tesla has billions of miles of real-world video that can be used for training, according to Tesla’s Social Media Team Lead Viv. This much data is essential because even today, FSD encounters “edge cases” that can confuse it, slow it down, or render it incapable of continuing, throwing up the dreaded red hands telling the user to take over.
Cosmos was trained on approximately 20 million hours of footage, including human activities like walking and manipulating objects. Tesla’s fleet, on the other hand, gathers roughly 2,380 hours of real-world video every minute. At that rate, the fleet collects 20 million hours of footage every 140 hours - just shy of 6 days. That’s back-of-the-napkin math, assuming an average speed of 60 mph.
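The comparison above is quick arithmetic, and it can be checked directly. Note the collection rate is this article's own estimate, not an official Tesla figure:

```python
# Back-of-the-napkin check of the fleet-data comparison.
FLEET_HOURS_PER_MINUTE = 2_380       # est. hours of video gathered per real-time minute
COSMOS_TRAINING_HOURS = 20_000_000   # approximate footage Cosmos was trained on

minutes_needed = COSMOS_TRAINING_HOURS / FLEET_HOURS_PER_MINUTE  # ~8,403 minutes
hours_needed = minutes_needed / 60                               # ~140 hours
days_needed = hours_needed / 24                                  # ~5.8 days

print(f"{hours_needed:.0f} hours (~{days_needed:.1f} days)")  # 140 hours (~5.8 days)
```

In other words, at that estimated rate the fleet matches Cosmos's entire training corpus in under a week.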
Generative Worlds
Both Tesla’s FSD and Nvidia’s Cosmos can generate highly realistic, physics-based worlds. These life-like environments simulate the movement of people and traffic and the real-world placement of obstacles such as curbs, fences, and buildings.
Tesla uses a combination of real-world and synthetic data, weighted heavily toward real-world data. Companies that use Cosmos, meanwhile, will be weighting their data heavily toward synthetically created situations, drastically limiting the kinds of cases that appear in their training datasets.
As such, while generative worlds may be useful for quickly validating an AI, we would argue they aren’t as useful as real-world data for training one.
Overall, Cosmos is an exciting step - others are clearly following in Tesla’s footsteps, but they’re extremely far behind in real-world data. Tesla has built a massive first-mover advantage in AI and autonomy, and others are now playing catch-up.
We’re excited to see how Tesla’s future deployment of its Dojo Supercomputer for Data Labelling adds to its pre-existing lead, and how Cortex will be able to expand, as well as what competitors are going to be bringing to the table. After all, competition breeds innovation - and that’s how Tesla innovated in the EV space to begin with.
Last night, Tesla released software update 2024.45.25.15, which includes FSD V12.6.1. This update adds support for all HW3 vehicles, including the Model 3 and Model Y. We’re excited to see the continued support for HW3 owners.
FSD V12.6.1
V12.6.1 is now going wide, according to Ashok Elluswamy, Tesla’s VP of AI. This update is going to the Model 3 and Model Y for the first time - as only the Model S and Model X were included in FSD V12.6.
V12.6 is a big step forward for HW3 - it includes End-to-End on Highway, Improved City Streets Behavior, and Smoother and More Accurate Tracking - all contributing towards a better, smoother, and more comfortable build of FSD. You can read our comparison between FSD V12.6 and V13.2.2 here.
In short, FSD V12.6 performs considerably closer to V13 than V12.5.4.2 did - a massive improvement. It performs about as well as the Cybertruck version of FSD V13, which is still missing a few features compared to other HW4 vehicles, but it’s a great sign for HW3. Many of the gains come from improved lane selection and decision-making - the vehicle hesitates far less on V12.6, making the ride a lot smoother. Many early V12.6 testers said it felt more like a V13-mini than anything else.
Legacy Model S & X
We haven’t seen this update hit any legacy Model S and Model X vehicles just yet. We’re not sure whether Ashok’s “generally” applies here - but it should. If you do get the update, please let us know.
Legacy Model S and Model X vehicles remain on an older FSD build and may not see another FSD update for a little while longer. While they have the same FSD computer as other vehicles, there are enough other hardware differences to require a build specific to these models.
FSD V12.6.1 is going out now to the redesigned Model S and X with HW3 and all Model 3 and Model Y vehicles with HW3. The initial wave went out last night, and we expect to see more later today or tomorrow. If this release ends up going “wide,” we should see much larger waves go out next week.