Elon Musk opened Tesla's AI Day 2022 by saying, "I want to set some expectations with respect to our Optimus Robot," just before the doors opened behind him. A robot walked out, waved at the audience, and did a little dance. Acknowledging the humble debut, he explained that "the Robot can actually do a lot more than what we just showed you. We just didn't want it to fall on its face." Musk laid out his vision for the Tesla Robot: "Optimus is going to be incredible in five years, ten years mind-blowing." The CEO said that while other world-changing technologies have plateaued, the Robot is just getting started.
Tesla's CEO envisions Optimus eventually being like Commander Data, the android from Star Trek: The Next Generation, except it "would be programmed to be less robot-like and more friendly." Undoubtedly, there is a long way to go to match what Doctor Noonien Soong created in Star Trek: TNG. What was demonstrated onstage wasn't at that level, but several videos throughout the presentation highlighted what the Robot is capable of at this very early stage of development. The audience watched the Robot pick up boxes, deliver packages, water plants, and work at a station at the Tesla factory in Fremont.
Development over 8 Months
The breakdown of some of the systems of the Tesla Robot
Tesla (Edited by Not a Tesla App)
The first robot to take the stage at AI Day was not Optimus but Bumble C, another nod to the Transformers franchise, in which Bumblebee plays a significant role. Bumble C is far less advanced than Optimus, which appeared later but remained on a cart.
Several Tesla engineers took turns at the microphone describing some of the most complex elements of the project, which was first announced one year ago. Perhaps the best description of the effort is that the company is moving from building a robot on wheels to a robot on legs, though that may be an oversimplification. For example, the car has two motors, while the Robot has 28 actuators.
Overall Design and Battery Life
Tesla's brightest demonstrated how the design has come to life over the past eight months. This group of computer masterminds had to become students of anatomy, as Tesla took cues from the human body to create a humanoid robot. That is an essential factor in designing Optimus: everything people interact with is built to be usable by a human, with two legs, two arms, and ten fingers. If the Robot differed from what the world is already designed for, everything would have to change. However, recreating the human body and its countless movements would take far too long, so Tesla stripped it down to fewer than 30 core movements, not including the hands.
Just as the human torso contains the heart, the Robot's chest holds the battery. Tesla projects that a single charge of the 2.3-kilowatt-hour pack will provide enough energy for a full day's work. All the battery electronics are integrated into a single printed circuit board within the pack, keeping charge management and power distribution in one place. Tesla applied lessons learned from vehicle and energy production to the battery's design, allowing for streamlined manufacturing and simple, effective cooling.
Autopilot Technology
Tesla showed what the Robot sees, and it looked very familiar. That's because the neural networks are pulled directly from Autopilot. New training data had to be collected for indoor settings and other situations the car doesn't encounter. Engineers have trained neural networks to identify high-frequency features and key points within the Robot's camera streams, such as a charging station. Tesla has also adapted its Autopilot simulator for use with the Robot's programming.
Tesla shows off what the Optimus robot sees
Tesla (Edited by Not a Tesla App)
The torso also contains the centralized computer that Tesla says will do everything a human brain does, such as processing vision data, making split-second decisions based on multi-sensory inputs, and supporting communications. In addition, the Robot is equipped with wireless connectivity and audio support. Yes, the Robot is going to have conversations: "we really want to have fun, be utilitarian and also be a friend and hang out with you," said Musk.
Motors Mimic Joints
The 28 actuators throughout the Robot's frame are placed where many of the joints are in the human body. Just one of those actuators was shown lifting a half-tonne, nine-foot concert grand piano. Thousands of test models have been run to show how each motor works with the others and how to effectively operate the most relevant actuators for a given task. Even walking requires several calculations that the Robot must make in real time, not only to perform the motion but also to appear natural. The robots will be programmed with locomotion code: the desired path goes to the locomotion planner, which generates trajectories that feed into state estimation, very similar to the human vestibular system.
Human hands can move at 300 degrees per second and have tens of thousands of tactile sensors. Hands can manipulate anything in our daily lives, from bulky, heavy items to something delicate. Now Tesla is recreating that with Optimus. Six actuators and 11 degrees of freedom are incorporated into the robot hand. It has an in-hand controller that drives the fingers and receives sensory feedback. The fingers have metallic tendons to allow for flexibility and strength. The hands are being designed to allow for a precision grip of small parts and tools.
Responsible Robot Safety
Musk wanted to start AI Day with the epic opening scene from Terminator, in which a robot crushes a skull. He has heard the fears and the warnings, "don't go down the Terminator path," but the CEO said safety is a top priority. There are safeguards in place, including designs for a localized control ROM, not connected to the internet, that can turn the Robot off. He sees this as a stop button or remote control.
Optimus Price
Musk said the development of Optimus may broaden Tesla's mission statement to include "making the future awesome." He believes the potential is not recognized by most, and it "really boggles the mind." Musk said, "this means a future of abundance. There is no poverty. You can have whatever you want in terms of products and services. It really is a fundamental transformation of civilization as we know it." All of this at a price predicted to be less than $20,000 USD.
Tesla Shows Off its First Robot at AI Day 2
Another quarter has passed, and that means it’s time to submit questions and vote for Tesla’s Q2 2025 Earnings Call. While Q1 was a tough quarter for the company, Q2 saw some recovery in sales, although there’s still some work to be done.
However, there’s always a lot to be excited about during Tesla’s Q&A session, where we usually learn a lot about future software improvements and upcoming vehicles. We may hear more about FSD Unsupervised, Robotaxi, the more affordable vehicle, or the upcoming larger six-seater Model Y, the Model Y L. Tesla also mentioned a potential FSD price hike back in the Q1 2025 Earnings Call, so that could be brought up as well.
Tesla’s Q2 So Far
Tesla has already released its Q2 2025 Production and Delivery numbers, which were up from Q1 of this year but still down compared to Q2 of last year.
Model 3/Y: 396,835 produced / 373,728 delivered
Model S, X, and Cybertruck: 13,409 produced / 10,394 delivered
Total: 410,244 produced / 384,122 delivered
How to Submit & Vote
Tesla lets shareholders submit a question that will be voted on and may be answered during the Q&A session. To submit your own question or vote on an already submitted question, you’ll need to be a verified shareholder. You can go to Say’s platform and link your brokerage accounts.
Once your account is verified, you’ll be able to log in and vote your shares on your own question or on someone else’s.
Here’s the link to get started on Say’s Tesla Q&A. You must submit your questions and votes by July 23rd, 2025, at 4:00 PM EDT.
Top Questions So Far
Unsurprisingly, people have already been submitting questions, and here are the top ones so far.
Can you give us some insight into how robotaxis have been performing so far and at what rate you expect to expand in terms of vehicles, geofence, cities, and supervisors?
What are the key technical and regulatory hurdles still remaining for unsupervised FSD to be available for personal use? Timeline?
What specific factory tasks is Optimus currently performing, and what is the expected timeline for scaling production to enable external sales? How does Tesla envision Optimus contributing to revenue in the next 2–3 years?
Can you provide an update on the development and production timeline for Tesla’s more affordable models? How will these models balance cost reduction with profitability, and what impact do you expect on demand in the current economic climate?
Is there any news on HW3 users getting retrofits or upgrades? Will they get HW4 or some future version of HW5?
When do you anticipate customer vehicles to receive unsupervised FSD?
And here are some other ones we found interesting:
Have any meaningful Optimus milestones changed for this year or next, and will thousands of Optimus robots be performing tasks in Tesla factories by year end?
Are front bumper cameras going to be necessary for unsupervised Full Self-Driving? If so, what is the company's plan to retrofit vehicles that do not have them?
Will there be a new AI Day to explain the advancements the Autopilot, Optimus, and Dojo/chip teams have made over the past several years? We still do not know much about HW4.
Earnings Call Details
Tesla will hold its earnings call on Wednesday, July 23rd, at 4:00 PM EDT. It's still too early for an access link, but we’ll make sure a link is up on the site before the call that day.
If you do miss the earnings call, no worries. We will provide a full recap following the call, and we’ll also do some in-depth dives into what was said and what we know.
Tesla’s Summer Update, 2025.26, has finally launched, bringing with it a batch of interesting new features for some, and a bunch of quality-of-life improvements for everyone else.
Grok AI Assistant
The star of the Summer Update is Grok, xAI’s conversational AI assistant, which has now landed in Tesla vehicles. For now, it's available in any Tesla that has an AMD processor, and it's potentially coming to Intel-based vehicles in the near future. The feature is also only available in the U.S., but it’s expected to expand to other regions, hopefully soon.
Grok is in its first iteration as an in-vehicle assistant, and for now, cannot control the vehicle, which means that Tesla’s voice command system is still intact. However, there is a lot it can do already. Grok is activated by pressing and holding the voice button (right scroll wheel on older vehicles), while a short press of the button is still reserved for voice commands. Grok will support a wake word in the future, letting you activate it without pressing a button.
Once Grok is open, which can also be done by tapping the Grok app icon, users can tailor the AI personality according to their preferences by selecting a persona and voice of their choice.
There are also several other settings for Grok under the settings button. You can enable NSFW mode, Kids Mode, or disable access to your vehicle’s location.
Grok has contextual awareness of your vehicle location, which means it can provide relevant answers to questions like “Where should I go for dinner?”
Logging In Not Required
Grok is free with Premium Connectivity, or if you’re using your phone’s hotspot feature or connected to WiFi, so anyone can try it for free. In fact, you don’t even need to log in to start using Grok. However, logging in adds some additional features.
If you’d like to log in, you can do so by scanning the QR code in the vehicle, which will provide chat management and transcripts, SuperGrok access (if you pay for a subscription), and better privacy control.
Light Sync
Tesla has added a new Light Sync feature that pulses the vehicle’s ambient lighting in sync with the music being played. The option is turned on under Toybox > Light Sync. There are also a few options, including the ability to match the ambient light colors to the album’s artwork instead of using your selected color.
In addition, while in Park you can enable Rave Cave, which cranks the ambient lighting brightness up to the maximum.
Dashcam App Update
The Dashcam app now allows you to adjust playback speeds, just like the older Dashcam Viewer, which is still used on Intel-based vehicles.
In addition to adjusting playback speed, you can now adjust the video view so that it’s displayed without being obstructed by the buttons at the top (video below). The difference is small, but could be useful if you’re trying to see something slightly out of view or that’s hidden behind the top Dashcam buttons.
While the Cybertruck has also received the updated Dashcam Viewer with this update, it does not have the new B-pillar camera recordings like other HW4 cars.
Sentry Mode Ambient Lighting
Sentry Mode is getting one of the best uses of the vehicle’s ambient lighting that we’ve seen so far. The ambient lighting will now slowly pulse red while Sentry Mode is activated to grab someone’s attention, instead of just relying on the vehicle’s display.
While you can disable Sentry Mode sounds, we’d love to see an even more stealthy Sentry Mode that also disables the ambient lighting and screen, allowing the vehicle to record without anyone being aware.
Since the ambient lighting is being used in this case to make people more aware of a feature, Tesla could also use it in other modes, such as Dog Mode.
There are a lot of potential uses for ambient lighting. Tesla can make it glow while the vehicle is charging, with the brightness potentially related to the vehicle's charge level.
Supercharger Details
When you navigate to a Supercharger, new icons in the charger list will indicate locations that require valet service or pay-to-park access.
Upon arrival at the location, a notification will appear on your screen, displaying important details such as access codes, parking restrictions, level/floor information for parking garages, and restroom availability. This information will also be available on the site card in the navigation.
Equalizer Presets
max_bracco/X
Tesla has moved the audio settings from the music player directly into the vehicle settings, making them much easier to find. In addition to the new “Audio” section in settings, you can now create and save equalizer presets.
Each preset can have a name, custom EQ settings, and a setting for immersive audio.
max_bracco/X
Onboarding Guide
Not a Tesla App
Tesla has introduced a new Onboarding Guide for new owners. The guide covers driver settings, touchscreen use, steering wheel and seat setup, and how to control key portions of the vehicle, including lights, wipers, and Autopilot features.
The Onboarding Guide is automatically initiated when a new owner accepts delivery of a Tesla, or can be manually initiated at any time by going to Controls > Service > Onboarding Guide.
This appears to be available only on the new Model 3 and new Model Y.
In typical Tesla fashion, the 2025.26 update is rolling out gradually in small waves. Three waves have already gone out, so all signs point to a wide release soon.