Tesla has released Vision Park Assist with software update 2023.6.9
Tesla has unveiled the Vision Park Assist feature with its new software update, version 2023.6.9, for non-FSD Beta vehicles. This feature employs the car's cameras to measure distances to nearby objects, offering drivers valuable parking assistance.
When Tesla removed ultrasonic sensors (USS) from its vehicles six months ago, some owners expressed concerns about the loss of parking assistance. In response, Tesla began transitioning to a vision-based solution, culminating in the introduction of Vision Park Assist.
Accuracy of Park Assist
Twitter user @EVBaymax couldn’t wait until morning to test out the new Vision Park Assist feature. Armed with his Model 3 and a measuring tape, he put the new technology to the test and shared it all on Twitter, providing valuable insight into its performance. In one video, he said, “super-impressive what Tesla has been able to do. This is… Wow! I’m impressed.” His tape measurements showed the car was within an inch or two of the distance displayed on the screen.
However, he did spot something less impressive. When shifting into Drive or Reverse after being parked for a few minutes, a message pops up that reads: Park Assist is Loading. The load took 6-8 seconds as the system recalled what was around the car before it was parked - a noticeable lag compared to USS-equipped vehicles. @EVBaymax is hopeful this will be addressed in a future update. Notably, the vehicle did eventually restore the data it had before it was turned off, showing the same distance to the curb in front of it even though the curb was out of view of the cameras - suggesting the system caches its last known surroundings, a behavior sketched below.
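To make that behavior concrete, here's a minimal sketch of how a vision system might persist its last known surroundings across a park cycle. Everything here - the file location, the data layout, the staleness check - is a hypothetical illustration; Tesla hasn't published how Park Assist actually stores this state.

```python
import json
import time
from pathlib import Path

# Hypothetical illustration only: one plausible way to remember
# surroundings across a park/restart cycle.

CACHE_FILE = Path("/tmp/park_assist_state.json")  # invented location

def save_surroundings(objects: list) -> None:
    """Persist the last distance estimates (e.g., to a curb) before shutdown."""
    CACHE_FILE.write_text(json.dumps({"saved_at": time.time(), "objects": objects}))

def restore_surroundings(max_age_s: float = 3600.0) -> list:
    """Reload cached estimates on wake; the 6-8 second 'Park Assist is
    Loading' delay would correspond to this restore-and-revalidate step."""
    if not CACHE_FILE.exists():
        return []
    state = json.loads(CACHE_FILE.read_text())
    if time.time() - state["saved_at"] > max_age_s:
        return []  # too stale to trust - wait for fresh camera data instead
    return state["objects"]

# Example: a curb 30 cm ahead, remembered even though it now sits
# below the front camera's field of view.
save_surroundings([{"type": "curb", "bearing_deg": 0, "distance_cm": 30}])
print(restore_surroundings())
```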
Several online videos show a significant difference in readings between USS and vision. USS readings are mostly smooth, with straight edges, while the vision-based display rarely shows straight lines. When backing up to a curb, @EVBaymax notes that the line representing the curb is “squiggly and is moving.”
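The squiggly line likely reflects frame-to-frame noise in the camera-based depth estimates. One common remedy for that kind of jitter - not something Tesla has confirmed using - is a simple exponential moving average over the distance stream, sketched below.

```python
# Exponentially smooth a stream of noisy distance estimates, trading a
# little responsiveness for a much steadier on-screen reading.

def smooth_distances(raw_cm: list, alpha: float = 0.3) -> list:
    """Apply an exponential moving average to distance readings (cm)."""
    smoothed = []
    estimate = raw_cm[0]
    for reading in raw_cm:
        estimate = alpha * reading + (1 - alpha) * estimate
        smoothed.append(round(estimate, 1))
    return smoothed

# Noisy vision readings of a curb roughly 30 cm away:
print(smooth_distances([31.0, 28.5, 32.2, 29.1, 30.4, 27.8]))
# -> values converge toward ~30 cm with far less jitter
```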
Availability
Although Park Assist was initially included in FSD Beta 11.3.2 and limited to North American markets, Tesla is now rolling it out to additional markets with update 2023.6.9.
Currently, the Vision Park Assist feature is compatible with Model 3 and Model Y vehicles. Users also have the option to turn off Park Assist if they prefer, just like owners with USS. This innovative technology offers 360-degree detection, instead of just front and rear, as highlighted in our previous article.
Park Assist Detecting a Curb
One of the advantages of vision-based Park Assist is the ability to detect objects alongside the vehicle. @EVBaymax does a great job illustrating that in the video below.
At this time, it appears that vehicles with ultrasonic sensors still offer a higher level of accuracy; however, that could depend on the height and type of the object.
Vision Park Assist does not currently apply to vehicles with ultrasonic sensors. However, since it provides some advantages over the USS-based version, it'll be interesting to see whether Tesla brings it to all vehicles as the feature matures.
As more Tesla owners install and utilize Vision Park Assist, the feature is expected to improve. The company will use the collected data to enhance distance estimates, aiming for accuracy on par with sensor-based systems.
Tesla's Vision Park Assist offers visual and auditory alerts for objects in the vehicle's surroundings, utilizing the occupancy network to generate high-definition object outlines. However, it is essential to remember that this feature should be treated as guidance, not as a substitute for an attentive driver.
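As a rough illustration of how an occupancy network's output could drive those alerts, here's a toy sketch: the network marks which cells around the car are occupied, and the alert logic reduces that to a nearest-obstacle distance. The grid size, thresholds, and function names are all invented for illustration - Tesla's actual occupancy network runs at far higher resolution.

```python
import math

CELL_SIZE_M = 0.1  # each grid cell covers 10 cm x 10 cm (invented)

def nearest_obstacle_m(grid: list, car_row: int, car_col: int) -> float:
    """Distance (meters) from the car's cell to the closest occupied cell."""
    best = math.inf
    for r, row in enumerate(grid):
        for c, occupied in enumerate(row):
            if occupied:
                best = min(best, math.hypot(r - car_row, c - car_col) * CELL_SIZE_M)
    return best

def alert_level(distance_m: float) -> str:
    """Map distance to the kind of escalating chime a driver would hear."""
    if distance_m < 0.3:
        return "continuous tone"
    if distance_m < 0.6:
        return "fast beeping"
    if distance_m < 1.2:
        return "slow beeping"
    return "silent"

grid = [[0, 0, 1],
        [0, 0, 0],
        [0, 0, 0]]  # one occupied cell (e.g., a pole) near the car
d = nearest_obstacle_m(grid, car_row=2, car_col=0)
print(f"{d:.2f} m -> {alert_level(d)}")  # 0.28 m -> continuous tone
```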
Tesla recently showed off a demo of Optimus, its humanoid robot, walking around on moderately challenging terrain - not a flat surface, but dirt and slopes. Terrain like this can be difficult for a humanoid robot, especially during the training cycle.
Most interestingly, Milan Kovac, VP of Engineering for Optimus, clarified what it takes to get Optimus to this stage. Let’s break down what he said.
Optimus is Blind
Optimus is getting seriously good at walking now - it can keep its balance over uneven ground, even while walking blind. Tesla is currently using only the robot's onboard sensors - no vision - all feeding a neural net running on the embedded computer.
Essentially, Tesla is building Optimus from the ground up, relying on as much auxiliary sensor data as possible while it trains vision. This is similar to how it trains FSD on vehicles, using LiDAR rigs to validate the vision system's accuracy. While Optimus doesn't have LiDAR, it relies on all the other sensors on board, many of which will likely be simplified away as vision takes over as the primary sensor.
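Here's a toy sketch of that validation pattern: a trusted auxiliary sensor (LiDAR for FSD, force and joint sensors for Optimus) provides ground truth used to score and correct a vision model's estimates. The data and the simple scale-correction model are invented; this shows the general idea, not Tesla's pipeline.

```python
# Fit and remove a systematic bias in vision estimates using a trusted
# sensor as ground truth (least-squares fit of vision ~ scale * truth).

vision_estimates = [1.02, 2.10, 2.95, 4.12, 5.05]   # meters, from cameras
sensor_truth     = [1.00, 2.00, 3.00, 4.00, 5.00]   # meters, trusted sensor

scale = sum(v * t for v, t in zip(vision_estimates, sensor_truth)) / \
        sum(t * t for t in sensor_truth)

corrected = [v / scale for v in vision_estimates]
errors = [abs(c - t) for c, t in zip(corrected, sensor_truth)]
print(f"learned scale bias: {scale:.3f}")
print(f"mean error after correction: {sum(errors) / len(errors):.3f} m")
```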
Today, Optimus is walking blind, but it’s able to react almost instantly to changes in the terrain underneath it, even if it falls or slips.
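Architecturally, "walking blind" implies a control loop that maps only proprioceptive readings - no camera input - to joint commands through a learned policy, running fast enough to react to slips. A minimal sketch, with all names and shapes invented for illustration (Tesla hasn't published Optimus's control interface):

```python
import random

def read_proprioception() -> list:
    """Stand-in for IMU orientation, joint angles, and foot-force readings."""
    return [random.uniform(-1.0, 1.0) for _ in range(32)]

def policy(sensor_state: list) -> list:
    """Stand-in for the neural net on the embedded computer - in reality a
    trained network, here just a dummy linear mapping."""
    return [0.1 * s for s in sensor_state[:12]]  # 12 leg-joint torque commands

def control_step() -> list:
    # Runs at a high fixed rate (hundreds of Hz) so the robot can react
    # to a slip within a few milliseconds - no cameras involved.
    return policy(read_proprioception())

print(control_step())
```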
What’s Next?
Next up, Tesla AI will be adding vision to Optimus - helping complete the neural net. Remember, Optimus runs on the same overall AI stack as FSD - in fact, Optimus uses an FSD computer and an offshoot of the FSD stack for vision-based tasks.
Milan mentions they’re planning on adding vision to help the robot plan ahead and improve its walking gait. While the zombie shuffle is iconic and a little bit amusing, getting humanoid robots to walk like humans is actually difficult.
There’s plenty more to come, too - including better responsiveness to velocity and direction commands, and learning to fall safely and stand back up. Falling while protecting yourself to minimize damage comes naturally to humans - but not to a robot. Training it to do so is essential to keeping the robot, its environment, and the people it interacts with safe.
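One common way to train that kind of behavior is reinforcement learning with a shaped reward: penalize hard impacts and self-damage, reward getting back up quickly. A hypothetical sketch of such a reward - Tesla hasn't shared how it actually trains fall recovery:

```python
# Invented reward shaping for fall recovery, illustration only.

def fall_recovery_reward(peak_impact_n: float,
                         joint_limit_exceeded: bool,
                         upright_after_s: float) -> float:
    """Score one simulated fall: softer landings, no self-damage, and a
    quick return to standing all earn a higher reward."""
    reward = -0.01 * peak_impact_n              # penalize hard impacts
    if joint_limit_exceeded:
        reward -= 50.0                          # heavily penalize self-damage
    if upright_after_s >= 0:                    # negative = never got up
        reward += max(0.0, 20.0 - upright_after_s)
    return reward

# A soft fall with a 4.5-second recovery scores well:
print(fall_recovery_reward(peak_impact_n=800.0,
                           joint_limit_exceeded=False,
                           upright_after_s=4.5))  # -> 7.5
```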
We’re excited to see what’s coming next for Optimus, especially since it’s already being put to work in some fashion in Tesla’s factories.
In a relatively surprising move, GM announced that it is realigning its autonomy strategy and prioritizing advanced driver assistance systems (ADAS) over fully autonomous vehicles.
GM is effectively closing Cruise, its autonomous vehicle division, and focusing on its Super Cruise ADAS feature instead. The engineering teams at Cruise will join the GM teams working on Super Cruise, shuttering the fully autonomous vehicle business.
End of Cruise
GM cites “an increasingly competitive robotaxi market” and the “considerable time and resources” required to scale the business to profitability. Essentially - they’re unable to keep up with competitors at current funding and research levels, falling further and further behind.
Cruise has been offering driverless rides in several cities, pairing HD maps of those cities with vehicles equipped with a dazzling array of over 40 sensors. Each Cruise vehicle is therefore a massive investment that doesn’t turn a profit while collecting data to work towards autonomy.
Cruise has definitely been on the back burner for a while; a quick glance at their website - since it's still up for now - shows the last time they officially released any major news was back in 2019.
Competition is Killer
Cruise’s most direct competitor, Waymo, is funded by Google, which maintains a direct interest in ensuring it has a play in the AI and autonomy space.
Interestingly, this news comes just a month after Tesla’s We, Robot event, where they showed off the Cybercab and the Robotaxi network, as well as plans to begin deployment of the network and Unsupervised FSD sometime in 2025. Tesla is already in talks with some cities in California and Texas to launch Robotaxi in 2025.
GM Admits Tesla Has the Right Strategy
As part of the business call following the announcement, GM admitted that Tesla’s end-to-end, vision-based approach to autonomy is the right strategy. While they say Cruise had started down that path, they’re setting aside their fully autonomous vehicle goals for now and focusing on bringing that tech to Super Cruise instead.
NEWS: GM just admitted that @Tesla’s end-to-end approach to autonomy is the right strategy.
“That’s where the industry is pivoting. Cruise had already started making headway down that path. We are moving to a foundation model and end-to-end approach going forward.” pic.twitter.com/ACs5SFKUc3
With GM now focused on Super Cruise, autonomy takes a back seat to ADAS features that relieve driver stress and improve safety. While those are positive goals that will benefit all road users, full autonomy is really the key to removing the massive impact that vehicle accidents have on society today.
In addition, Super Cruise is extremely limited: it cannot brake for traffic controls and doesn’t work in adverse conditions - even rain. It can only function when lane markings are clear, there are no construction zones, and the vehicle has an active data connection.
The final piece of the picture is that the vehicle has to be on an HD-mapped, compatible highway - essentially locking Super Cruise to wherever GM has spent time mapping, rather than working anywhere in a general sense, like FSD or Autopilot. The sketch below illustrates how restrictive that gating is.
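Here is a simplified sketch of that availability gating, based on the publicly documented Super Cruise requirements listed above; the field names and logic are illustrative, not GM's implementation.

```python
from dataclasses import dataclass

@dataclass
class DrivingContext:
    on_hd_mapped_highway: bool
    lane_markings_visible: bool
    in_construction_zone: bool
    raining: bool
    data_connection_ok: bool

def super_cruise_available(ctx: DrivingContext) -> bool:
    """Every condition must hold simultaneously - which is why coverage
    is limited to wherever GM has mapped."""
    return (ctx.on_hd_mapped_highway
            and ctx.lane_markings_visible
            and not ctx.in_construction_zone
            and not ctx.raining
            and ctx.data_connection_ok)

# Everything is fine except light rain - and the feature still can't engage:
ctx = DrivingContext(on_hd_mapped_highway=True, lane_markings_visible=True,
                     in_construction_zone=False, raining=True,
                     data_connection_ok=True)
print(super_cruise_available(ctx))  # False
```

A vision-first system like FSD effectively replaces the HD-map and connectivity checks with on-the-fly perception, which is why it can engage on unmapped roads.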
Others Impressed - Licensing FSD
Interestingly, some other manufacturers have also weighed in on the demise of Cruise. BMW, in a now-deleted post, said that a demo of Tesla’s FSD was “very impressive.” There’s a distinct chance that BMW and other manufacturers are watching to see what Tesla does next.
BMW chimes in on a now-deleted post. The Internet is forever, BMW!
FSD seems to have caught their eye after We, Robot, with the online demonstrations of FSD V13.2 appearing to be the pivot point. At the 2024 Shareholder Meeting earlier in the year, Elon Musk shared that several manufacturers had reached out to understand what would be required to license FSD from Tesla.
There is a good chance 2025 will be the year we see legacy manufacturers announce FSD adoption - similar to the surprise announcements around the adoption of the NACS charging standard.