Nearly a year after Tesla started testing its controversial “Full Self-Driving” (FSD) beta software with a select group of customers, the company’s CEO Elon Musk said he is aiming for a wider release by the end of September. The news comes as an older version of the software leaked online.
Musk said on Twitter that the company would begin rolling out FSD version 10 to customers in its early access program at midnight on September 10th. Then, the software “will need another few weeks after that for tuning [and] bug fixes,” a period Musk estimated at roughly four weeks. At that point, a “public beta button” will be made available to more Tesla customers, which is expected to take the form of a download button for people who purchased the FSD package.
Naturally, this should all be taken with a giant grain of salt. Musk has been promising a wider release of the beta software for those customers who purchased the FSD package (which currently costs $10,000) for a while now. It’s possible that Tesla will blow past this deadline and Musk will tweet a new date to get customers excited.
Just to give you a sense of how long this has been going on: In 2018, Musk said that the “long awaited” version of FSD would begin rolling out in August of that year, which didn’t happen. He did it again in 2019, proclaiming that “a year from now” there would be “over a million cars with full self-driving, software, everything.” That also didn’t happen. The company actually began shipping FSD version 9 in July, but only to members of its early access program.
So to say that Tesla fans have been anticipating this update for a while would be an understatement. Some customers have gotten sick of waiting and sued the company for false advertising.
Tesla says the software controls the vehicle’s steering, lane centering, braking, and acceleration on highways and city streets. But it is still considered a Level 2 advanced driver assist system because it requires driver supervision at all times. Studies have shown that this supervisory role can make it more difficult for drivers to stay vigilant on the road, which can be dangerous. The driver is also legally responsible for the vehicle, which some say undercuts Tesla’s marketing of its product as “full self-driving.”
There’s no question that Tesla is more willing than its competitors to test beta versions of its Autopilot driver assist feature on its customers in the interest of gathering data and working out any bugs in the system. And Tesla customers are mostly fine with this, routinely flooding Musk’s mentions with pleas to be admitted into the company’s early access program for beta testers. This has helped cement Tesla’s public reputation as a leader in autonomous driving, despite its vehicles consistently falling short of what most experts would agree defines a self-driving car.
Meanwhile, an older version of Tesla’s FSD software has leaked and is currently spreading among the hacker community, according to Electrek. Citing unnamed sources, the website reported that binary firmware files of the software are being passed around in the Tesla root access community. According to SSH.com:
Having root access generally means being able to log into some root account on the server, or being able to run commands as root on the server, for example by using some privilege escalation tool such as sudo.
Some Tesla owners have used root access to examine the company’s software releases and access some unreleased features. With root access, an owner can actually run leaked software builds like this one on their own vehicle. According to Electrek, the hacker community generally tries to keep things quiet so as not to alarm Tesla.
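To make the concept concrete, here is a generic illustration, not Tesla’s actual software or tooling: on a Linux-style system, a process can tell whether it is running with root privileges by checking its effective user ID. Root, with user ID 0, can read and modify essentially anything on the machine, which is what makes this level of access so useful for digging into unreleased firmware.

```python
import os

def running_as_root() -> bool:
    """Report whether this process has root privileges (illustrative only)."""
    # On Unix-like systems, the root account has an effective user ID of 0.
    # A root process can read, modify, and execute essentially anything,
    # which is why root access lets owners inspect and run unreleased builds.
    return os.geteuid() == 0

if __name__ == "__main__":
    print("Running as root:", running_as_root())
```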
However, a Tesla owner in Ukraine recently posted a video showing version 8.2 of the FSD beta in use in Kyiv, where Tesla hasn’t released it. Tesla has only been developing FSD for the US market and hasn’t adapted it for use in other countries, which have different road signs and driving rules. As noted by hacker @greentheonly, this is an example of how Tesla’s FSD works in a location without “decent maps.”
It’s also a helpful reminder of what can happen when self-driving software isn’t underpinned by high-definition maps. Unlike autonomous vehicle companies such as Waymo and Cruise, Tesla does not use HD maps, nor does it restrict its software from being used in certain areas, a practice known as geofencing.
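As a rough sketch of how geofencing can work (the coordinates, names, and logic here are hypothetical, not drawn from any company’s actual code), software can simply check whether the vehicle’s reported GPS position falls inside an approved region before allowing a feature to engage:

```python
from dataclasses import dataclass

@dataclass
class BoundingBox:
    """A rectangular geofence defined by min/max latitude and longitude."""
    min_lat: float
    max_lat: float
    min_lon: float
    max_lon: float

    def contains(self, lat: float, lon: float) -> bool:
        return (self.min_lat <= lat <= self.max_lat
                and self.min_lon <= lon <= self.max_lon)

# Hypothetical service area: roughly the San Francisco Bay Area.
APPROVED_AREA = BoundingBox(min_lat=37.2, max_lat=38.0,
                            min_lon=-122.6, max_lon=-121.7)

def feature_allowed(lat: float, lon: float) -> bool:
    # If the car reports a position outside the approved area,
    # the driver-assist feature simply refuses to engage.
    return APPROVED_AREA.contains(lat, lon)

print(feature_allowed(37.77, -122.42))  # True: inside the geofence
print(feature_allowed(50.45, 30.52))    # False: Kyiv is outside the geofence
```

Real operators define far more precise service-area polygons than a single bounding box, but even a crude check like this would presumably have kept the leaked beta from engaging outside its intended market.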
Musk has said he is trying to create autonomous vehicle technology that relies on vision-based sensors like cameras and software trained with neural networks. He has scoffed at the AV industry’s reliance on sensors like lidar, which uses lasers to identify nearby objects and which other companies say provides needed redundancy in case of failures.
The US government has taken a renewed interest in Tesla, recently announcing that it is investigating incidents in which Tesla cars operating on Autopilot have crashed into parked emergency vehicles.