Rimac announces the Verne robotaxi, which looks similar to Tesla’s concepts
Rimac, the company behind the Rimac Nevera electric hypercar, has announced that it intends to produce a robotaxi, and it looks quite similar to Tesla’s concepts. Much of what we’ve heard about Tesla’s upcoming robotaxi, the Cybercab, is featured in Rimac’s autonomous vehicle. From the two seats to the airy, center-screen-focused cabin, it’s all here, although there are significant differences as well. Rimac’s prototype, called Verne, was revealed on Wednesday, June 26th.
The Verne will include a 43” display (Image: MotorTrend)
The Verne is expected to begin operation in 2026 as a two-seater robotaxi using Mobileye’s autonomous driving technology. It is expected to be a Level 4 autonomous vehicle, meaning it would still require remote support for handling complex situations, similar to Waymo’s operations in San Francisco.
The Verne has a 43” display and 17 speakers, and is supposedly designed to emulate “a room on wheels,” with an inside-out design concept. Interestingly, rather than regular doors, the Verne has doors that slide forward horizontally, along with a keypad-based entry system.
A smaller screen between the front seats lets you control certain aspects of the vehicle (Image: MotorTrend)
Rimac says they have signed agreements to launch in 11 cities in the EU, the UK, and the Middle East. They have also mentioned they are negotiating contracts with 30 more cities worldwide.
Rimac also showed off images of its robotaxi app and a concept building for its robotaxis – presumably a charging and service hub.
The Verne will feature sliding doors, a lot like a minivan (Image: MotorTrend)
Comparing Rimac’s Robotaxi to Tesla’s
Although Tesla has yet to reveal the Cybercab, there are several things Tesla has already talked about for its upcoming robotaxi. One key difference between Rimac’s vision and Tesla’s is that Tesla appears to be chasing the cheapest possible transport, previously touting ride prices that would rival bus tickets, while Rimac appears to focus more on an ideal experience. While everyone loves extra luxury, at the end of the day, price usually wins.
The Rimac robotaxi app (Image: MotorTrend)
One example is Tesla’s single center screen, compared to Rimac’s two screens. In addition to the view-only 43” center display, which presumably is not a touchscreen, Rimac has a separate screen and controls between the two passenger seats. Tesla’s approach appears to focus on a single screen, with the user controlling much of the car’s functions, such as music and climate, through Tesla’s robotaxi app.
Another example is Rimac’s idea of including an entry pad and screen on the outside of the vehicle so passengers can unlock it. Tesla’s approach to unlocking a vehicle is expected to rely on temporary keys tied to users’ phones, leveraging ultra-wideband, much like how Tesla’s phone keys work today on newer vehicles.
Tesla’s approach to autonomy is also drastically different from Mobileye’s, which relies on radar, LiDAR, and more cameras than Tesla’s Autopilot suite uses today.
Viability
This announcement from Rimac is a bit of an oddity. As a company, Rimac has produced fewer than 150 vehicles in its short lifespan – all hand-designed and hand-built Rimac Nevera hypercars. While the Verne is visually appealing, Rimac’s ability to scale up and produce more than a handful of these robotaxis is questionable at best.
On the same front, Rimac recently received a €200 million grant from the EU as part of an economic recovery package for Croatia. Rimac has also received €80 million in funding from Hyundai and Kia – but that was to collaborate on a high-performance fuel cell electric vehicle and a high-performance EV sports car.
The exterior of the Verne robotaxi (Image: MotorTrend)
Beyond that, Rimac has never done any work on autonomy – the self-driving tech running the Verne is outsourced entirely to Mobileye. It seems the Verne will serve as Mobileye’s real-world test of whether its technology can power a robotaxi platform on its own.
Tesla previously used Mobileye’s technology for its own autonomy in its early years (AP 1) but quickly moved on to its own vision-based camera tech instead.
The interior of the Verne (Image: MotorTrend)
Last week, Mark Rober, an engineering YouTuber best known for his glitter bombs, released a video where he tested Tesla's Autopilot against various conditions - including the iconic ACME-style painted wall.
During this test, many people noted that Mark was using Autopilot rather than FSD, even though his video was titled “Can You Fool a Self-Driving Car?” The Tesla on Autopilot went up against a vehicle equipped with Luminar’s LiDAR rig, running some sort of basic autonomy or safety software.
New Video Tests FSD
Many people were disappointed with Mark’s video and his testing methods, so several creators got to work to actually test out Tesla’s FSD.
Creator Kyle Paul over on X made a much better follow-up video, using both an HW3 Model Y and an AI4 Cybertruck. In a relatively unsurprising turn of events, the Cybertruck successfully detected the wall, slowed down, and came to a stop. The Cybertruck was running FSD 13.2.8.
Kyle’s team did a fantastic job building the wall and testing it in a private area using FSD rather than Autopilot. On top of that, they re-tested the results several times and recorded the entire process inside and out. While Mark’s video was more for entertainment, Kyle set out to show what would actually happen in this unlikely scenario.
Sadly, the HW3 Model Y was unable to detect the wall, and manual intervention was required in each test. While the Model Y was running FSD 12.5.4.2 rather than an FSD V12.6 build, we don’t expect this to have had a significant impact on the test - this is more of an issue with how computer vision analyzes the environment.
There are several major differences between HW3 and HW4. The first is obviously that the FSD version that runs on AI4 is more advanced, as the hardware is capable of processing a lot more data. However, AI4 also features much higher-resolution cameras than HW3, and Tesla recently added the ability for the video feeds to be processed at full resolution with FSD V13. This could have made the difference, although it’s not entirely clear. Perhaps if HW3 gets a version of FSD V13 in the future, it can be retested to see whether it passes the “ACME wall” test.
Watch
Kyle’s entire video is below. It’s only 10 minutes long, so definitely give it a watch. Props to Kyle on the quick and thorough execution.
What Does This Mean for FSD?
We broke down Mark’s test - and examined all the little issues that we discovered after doing some in-depth research - you can read our analysis here.
Putting aside the issues with Mark’s testing and instead using the new results - it seems that if you had to go up against Wile E. Coyote and his ACME tools in your Tesla, cartoon logic may win if you’re in an HW3 vehicle. If you’re in an AI4 vehicle, you’ll likely come to a safe stop.
Vehicle depth perception is definitely something that Tesla has been hard at work to improve - and some fairly drastic improvements came with FSD V13 that haven’t been fully carried over to FSD V12 just yet. Future versions of HW3 FSD may be able to successfully determine that the wall is there. So Kyle - if you’re reading this - don’t get rid of that wall. We’d love to see more testing in the future.
However, this entire test scenario is so out of left field that there’s a good likelihood it would fool some human drivers as well. The most important part is that the future of autonomy won’t fall for these tricks, so it’s very unlikely that someone could weaponize this idea, as it would only work on a small segment of vehicles.
If Wile E. Coyote is after you, someone else may drive into the wall before your Tesla does.
We’re not kidding, this really happened already. This isn’t a realistic scenario outside of someone trying to play an insane prank - but it’s good to know that FSD V13 is capable of dodging this.
Tesla regularly holds quarterly all-hands meetings for employees, but last night marked the first time Tesla has live-streamed the event for the public.
The meeting primarily focused on Tesla’s employees, recent achievements, and the future of Tesla. While it didn’t reveal much new information, it was interesting to see Elon Musk candidly engaging with his teams, who seem to genuinely enjoy working with him. Still, there were a few noteworthy takeaways.
As with Tesla’s Earnings Calls and other live events, we’ve put together a concise, easy-to-digest recap of everything discussed.
General Points
Work-related injuries have declined over time
Planning to expand to new markets
Cell Manufacturing
Continuing to invest in battery supply
Cheapest, lowest cost-per-kWh cells
The Supercharger network continues to grow
Vehicle range and charging speed should match the needs of humans and their required time for breaks