Tesla FSD will be pure vision and will not rely on radar

By Nuno Cristovao

Elon tweeted that v9 of the FSD beta would remove its reliance on radar completely and instead make decisions based purely on vision. Humans don’t have radar, after all, so it seems like a logical solution, and it tells us Tesla is feeling much more confident in its vision AI.

Tesla Vision AI

Radar and vision each have their advantages, but radar has thus far been much more reliable in detecting objects and determining speed. If you’ve ever noticed your Tesla being able to detect two vehicles in front of you when you can only see the one directly ahead of you, that’s radar at work.

In this situation, the radio waves from the radar sensor bounce underneath the car in front of you and continue traveling, detecting that there is another object ahead even though it can never be “seen” directly.

It really is one of those wow moments where you can feel the future and the potential for AI-powered cars to one day drive better than humans. It’s baby steps, and slowly we’ll see more and more of these situations where the vehicle simply sees or does something we never could.

There’s no doubt that more sensors could provide a more reliable and accurate interpretation of the real world, as each has its own advantages. In an ideal world, a vehicle with radar, lidar, vision, ultrasonic sensors, and even audio processing would provide the best solution. However, more sensors and systems come at a price, resulting in increased vehicle cost and system complexity.

After all, humans are relatively safe drivers with two “cameras” and vision alone. If Tesla can completely solve vision, they’ll easily be able to achieve superhuman driving capabilities. Teslas have eight cameras facing in all directions, and they’re able to analyze all of them concurrently and make much more accurate interpretations than we ever could in the same amount of time.

Tristan on Twitter recently shared some great insight into Tesla’s vision AI and how they’re going to replace radar. Here’s what Tristan had to say:

"We recently got some insight into how Tesla is going to replace radar in the recent firmware updates + some nifty ML model techniques

From the binaries we can see that they've added velocity and acceleration outputs. These predictions in addition to the existing xyz outputs give much of the same information that radar traditionally provides (distance + velocity + acceleration).

For autosteer on city streets, you need to know the velocity and acceleration of cars in all directions but radar is only pointing forward. If it's accurate enough to make a left turn, radar is probably unnecessary for the most part.

How can a neural network figure out velocity and acceleration from static images you ask?

They can't!

They've recently switched to something that appears to be styled on a Recurrent Neural Network.

Net structure is unknown (LSTM?) but they're providing the net with a queue of the 15 most recent hidden states. Seems quite a bit easier to train than normal RNNs which need to learn to encode historical data and can have issues like vanishing gradients for longer time windows.

The velocity and acceleration predictions are new; by giving the last 15 frames (~1s) of data, I'd expect you can train a highly accurate net to predict velocity + acceleration based off of the learned time series.

They've already been using these queue-based RNNs with the normal position nets for a few months, presumably to improve stability of the predictions.

This matches with the recent public statements from Tesla about new models training on video instead of static images.

To evaluate the performance compared to radar, I bet Tesla has run some feature importance techniques on the models and radar importance has probably dropped quite a bit with the new nets. See tools like https://captum.ai for more info.

I still think that radar is going to stick around for quite a while for highway usage since the current camera performance in rain and snow isn't great.

NoA often disables in mild rain. City streets might behave better since the relative rain speed is lower.

One other nifty trick they've recently added is a task to rectify the images before feeding them into the neural nets.

This is common in classical CV applications, so I'm surprised it only popped up in the last couple of months.

This makes a lot of sense since it means that the nets don't need to learn the lens distortion. It also likely makes it a lot easier for the nets to correlate objects across multiple cameras since the movement is now much more linear.

For more background on LSTMs (Long Short-Term Memory) see https://towardsdatascience.com/illustrated-guide-to-lstms-and-gru-s-a-step-by-step-explanation-44e9eb85bf21

They're tricky to train because they need to encode history, which is fed into future runs. The more times you pass the state, the more the earlier frames are diluted, hence 'vanishing gradients'."
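To make the queue-of-hidden-states idea a little more concrete, here’s a minimal PyTorch-style sketch of a head that takes a fixed queue of the last 15 frames’ features and regresses position, velocity, and acceleration. It’s purely illustrative: the feature dimension, layer sizes, and assumed frame rate are our guesses, not Tesla’s actual architecture.

import torch
import torch.nn as nn

class QueueKinematicsHead(nn.Module):
    # Toy head: flatten a queue of recent per-frame features and regress
    # xyz position, velocity, and acceleration for one tracked object.
    def __init__(self, feature_dim=256, queue_len=15):
        super().__init__()
        self.mlp = nn.Sequential(
            nn.Linear(feature_dim * queue_len, 512),
            nn.ReLU(),
            nn.Linear(512, 9),  # 3 position + 3 velocity + 3 acceleration
        )

    def forward(self, feature_queue):
        # feature_queue: (batch, queue_len, feature_dim), oldest frame first
        flat = feature_queue.flatten(start_dim=1)
        position, velocity, acceleration = self.mlp(flat).split(3, dim=-1)
        return position, velocity, acceleration

head = QueueKinematicsHead()
frames = torch.randn(4, 15, 256)  # 15 frames (~1 s) of placeholder features
pos, vel, acc = head(frames)

Unlike a classic RNN, nothing here has to learn to compress history into a single recurrent state; the last second of context is simply handed to the net on every run, which is part of why this style is easier to train.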
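Tristan’s Captum reference is also worth unpacking: feature-importance (attribution) tools measure how much each input contributes to a model’s output, which is one way to check whether radar inputs still matter once the vision nets improve. Here’s a small, generic example using Captum’s Integrated Gradients on a toy model; the model and inputs are placeholders and have nothing to do with Tesla’s stack.

import torch
import torch.nn as nn
from captum.attr import IntegratedGradients

# Toy regression model: 20 input features -> 3 outputs.
model = nn.Sequential(nn.Linear(20, 64), nn.ReLU(), nn.Linear(64, 3))
model.eval()

ig = IntegratedGradients(model)
inputs = torch.randn(8, 20)

# Attribute output index 0 back to the 20 input features.
attributions = ig.attribute(inputs, target=0)
importance = attributions.abs().mean(dim=0)  # average importance per feature
print(importance)

If a group of input features, say a radar channel, consistently gets near-zero attributions, that’s evidence the model has stopped relying on it.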
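As for the rectification trick, undistorting camera frames before they reach the neural nets is a standard classical-CV step. Here’s roughly what it looks like with OpenCV; the camera matrix and distortion coefficients below are made-up placeholders that would normally come from a per-camera calibration.

import cv2
import numpy as np

# Placeholder intrinsics and distortion coefficients (k1, k2, p1, p2, k3).
camera_matrix = np.array([[1000.0, 0.0, 640.0],
                          [0.0, 1000.0, 360.0],
                          [0.0, 0.0, 1.0]])
dist_coeffs = np.array([-0.3, 0.1, 0.0, 0.0, 0.0])

# A dummy 720p frame stands in for a real camera image.
frame = np.zeros((720, 1280, 3), dtype=np.uint8)
rectified = cv2.undistort(frame, camera_matrix, dist_coeffs)

With the lens distortion removed up front, straight lines stay straight and a moving object follows a much more linear path across the image, which is exactly why it helps the nets correlate objects across multiple cameras.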

Tesla’s FSD beta v9 will be a big step forward from what FSD beta users have been using, where the system still relied on radar. And it’ll be an even bigger leap from what non-beta testers currently have access to. We can’t wait. Now where’s that button?

Tesla Introduces ‘Pay Later’ Option for Tesla Service Invoices in North America

By Karan Singh

Tesla recently introduced Buy Now, Pay Later (BNPL) payment options in the United States and Canada for items in the Tesla Shop, letting owners pay later for new vehicle accessories.

However, with Tesla app update 4.46, they’re expanding support to a much more critical area — Tesla Service. Qualifying owners in the U.S. and Canada will now be able to use Pay Later options for service performed by Tesla, which includes maintenance or repairs.

The pay-later services are provided by Affirm and Klarna, per Mark Fonte, a Senior Software Engineer working on the Tesla app.

Tesla app update 4.46 also added improvements to Tesla Assist, Wall Connector details, and Tesla Energy ownership changes, along with visualizations for the updated Model S and Model X.

How It Works

The new feature is seamlessly integrated into the existing service workflow within the Tesla app.

On the Service Estimate, before work on the vehicle begins, you will see a new message on the estimate screen: Pay over time - see if you qualify.

Tapping this link opens the payment calculator, which allows you to view potential payment structures and monthly costs. This provides a clear picture of what a payment plan would look like before you commit to servicing your vehicle.

Additionally, after service is complete and you are ready to pay, the final payment screen will present Affirm and Klarna (region-dependent) as selectable payment methods, alongside the usual options of Tesla Credit or your primary payment card.

Tapping Affirm or Klarna here will reopen the payment calculator, and a confirmation prompt will appear before selecting either BNPL option.

Service Now, Pay Later

Overall, the integration of BNPL providers for service is a thoughtful addition for vehicle owners. The terms can vary widely, so it’s important to compare them to other payment options you may have access to. Still, the added financial flexibility, when faced with a large repair bill, allows more owners to get their vehicle professionally and properly serviced by Tesla.

For those getting larger work done, such as high-voltage battery pack replacements, this is an excellent option to spread payments over a longer period, helping reduce the burden of vehicle repair.

Tesla Robotaxi: A Breakdown of Its New FSD Abilities

By Karan Singh

With the launch of Tesla’s Robotaxi Network, we didn't just get a peek into the future of transportation—we got a detailed look at the next version of FSD.

Videos from early access riders revealed some additional capabilities over current public FSD builds, showing off how it handles emergency vehicles and more.

Safety First for First Responders

One of the biggest changes in FSD’s capabilities is its improved handling of emergency vehicles. During a ride in Austin, a Robotaxi is seen identifying an approaching ambulance using a combination of visual and audio data, activating its turn signal, and smoothly pulling over to the side of the road to let the ambulance by (video below).

This is a driving task that requires more than simple awareness of laws. It requires reasoning skills to determine where to move the vehicle to create a safe path, as well as the ability to quickly identify an ambulance or another emergency service vehicle with its sirens and lights activated. Understanding the context and executing a safe and predictable maneuver is crucial, as a wrong maneuver could actually make matters worse.

For FSD and Robotaxi to gain both public trust and regulatory approval, this skill is non-negotiable, and Tesla demonstrated its advancements right here. It’s not surprising Tesla added this ability before Robotaxis made it to public roads.

This is a feature that Tesla previously mentioned would arrive as part of future updates to FSD V13, so expect it in future customer builds as well.

Automated Camera Cleaning

How does a fleet of Robotaxis keep its eyes clean without constant human intervention? Well, a clever new feature that Tesla has previously hinted at in their FSD release notes provides the answer. Robotaxi can now trigger a specific wiper and washer fluid sequence designed to clean the main front-facing cameras.

This might seem like a small detail, but it addresses one of Tesla’s primary challenges: maintaining sensor clarity. Rather than simply wiping the whole windshield multiple times, the sequence focuses extra washer fluid and wipes on the area in front of the cameras, cleaning the most important part of the windshield as thoroughly as possible.

Complex Maneuvers

Two areas where current builds of FSD V13.2.9 sometimes show hesitation are U-turns and navigating busy parking lots. The latest Robotaxi build appears to improve on both of these areas.

This first video shows a Robotaxi performing a flawless U-turn with no hesitation, and then smoothly switching lanes to take a turn.

Another video on X shows FSD’s updated confidence in navigating a complex parking lot for a precise drop-off. Today’s builds can sometimes struggle in parking lots, being slow and overly cautious when it isn’t needed, or too confident elsewhere. The Robotaxi FSD builds appear to address this with better path planning and more confidence.

We’re also likely to see FSD begin to handle more complex destination options, including parking garages and driveways, features that have been promised for almost a year. The Robotaxi FSD build has also gained the ability to safely pull over on a road, similar to the ambulance example above, but it uses this capability to drop off and pick up passengers. This is a feature that was mentioned in FSD v13.2’s Upcoming Improvements section.

Better Nighttime Performance

Driving at night presents additional challenges, including headlight glare and reduced visibility. The latest version of FSD appears to handle them with almost the same grace as it does daytime driving. Remember that Tesla’s Robotaxis are available until midnight. Early access riders mentioned that FSD is far smoother and a step up from the behavior of current FSD builds.

Human Support

Now, what happens when a passenger feels unsafe or has a critical question? Tesla has placed two key buttons on the rear screen for just those purposes. Riders can tap Call Support, which almost instantly connects them with a real human agent at Tesla’s Robotaxi Operations Center via video call.

While it isn’t a fundamental driving feature, it does mean that Tesla’s team can support Robotaxi vehicles remotely, such as issuing a command to have a vehicle proceed straight rather than attempt to turn through a gated community.

The other option, Pull Over, allows a rider to immediately request the vehicle to safely pull over, which it will do when it can find a safe and open location. At this point, you can either continue your trip or get out of the Robotaxi.

Both options prompt you with an “Are You Sure?” confirmation before letting you continue, which means your Robotaxi ride won’t come to an abrupt stop if you tap the “Pull Over” button by accident.

What This Means for Tesla Owners

These features are likely to be included in future FSD builds. This is essentially the new benchmark by which to judge FSD, at least once it begins rolling out to customer vehicles.

Many of the core driving improvements, such as the more confident maneuvering and emergency vehicle response, will make their way to the wider fleet in upcoming FSD updates.

Remember: Robotaxi isn’t just a service; it’s also a preview of Tesla’s driverless FSD builds.
