Tesla's Full Self-Driving (FSD) Beta 9 was put to the test on the crowded streets of San Francisco in a recent YouTube video, and it appears to still have a few kinks to work out. Self-driving is a tricky problem in general: with countless scenarios and thousands of things that can go wrong, the technology has to account for an enormous number of variables. When tested in real life, many of these issues and bugs are easy to spot, and some could potentially cause serious harm.
Tesla has been working on FSD for some time now, releasing beta updates and incrementally refining the technology. FSD has since become widely available to Tesla owners, depending on the model they own and which Autopilot system is installed. Users with the Basic Autopilot system can pay $199 a month for FSD, while those with the Enhanced Autopilot hardware installed can pay $99 a month. Alternatively, FSD can be purchased outright for $10,000. Either way, with FSD installed and ready to go, a Tesla gains some admittedly impressive features. Unfortunately, the system doesn't look ready or safe for the road just yet, even though users can grab the FSD Beta and give it a go.
In the video, YouTuber AI Addict showcases the new FSD Beta 9 update in action on the often complicated streets of San Francisco. Right off the bat, the Tesla Model 3 swerves into a bus-only lane. Later, while entering a turning lane, the car can be seen veering toward traffic after detecting an SUV on its right side. The video also shows multiple cases of the system disengaging when the car suddenly decides to hand control back to the driver. At one point, a bush scrapes along the side of the Tesla, even though it should have been recognized as an obstruction and avoided, something radar might have picked up. The video captures more than a few incidents that could have ended very poorly.
How FSD Works & What It Processes
Tesla's FSD – or any form of self-driving, for that matter – has to process thousands of pieces of information at once. Whether that information is processed correctly can be the difference between a near-miss and a collision. The video is full of near-misses, and without the driver's quick reactions, the outcomes could have been worse. With FSD Beta 9, processing on the Model Y and Model 3 relies on the camera-based Tesla Vision system rather than radar. These cameras can determine whether the vehicle ahead has its brake lights on and can distinguish between red, yellow, and green traffic signals. They also pick up uncommon objects in the way, like construction signs or pedestrians in the road, so the car can take caution around them. Although the Tesla Model 3 does seem to handle most of this well enough, even a single mistake can be one too many.
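To make the camera-only idea concrete, here is a deliberately simplified sketch of one such decision: guessing a traffic light's state from raw pixel colors. This is purely illustrative and is not Tesla's actual pipeline – the real Tesla Vision stack runs neural networks over multi-camera video – but it shows the kind of judgment a vision-only system must make from nothing but pixels, and why a single misclassification matters.

```python
# Toy example only: classify a traffic light's state from the average
# color of its lens region. Real perception systems use trained neural
# networks, not hand-tuned thresholds like these.

def classify_light(pixels):
    """Guess 'red', 'yellow', or 'green' from a list of (r, g, b) pixels."""
    n = len(pixels)
    avg_r = sum(p[0] for p in pixels) / n
    avg_g = sum(p[1] for p in pixels) / n

    # A lit yellow lens is bright in both the red and green channels.
    if avg_r > 150 and avg_g > 150:
        return "yellow"
    # Otherwise, whichever channel dominates decides red vs. green.
    return "red" if avg_r > avg_g else "green"

# A bright-red lens region should classify as 'red'.
print(classify_light([(220, 30, 20), (200, 40, 35)]))  # -> red
```

Even this toy version hints at the difficulty: glare, occlusion, or an unusual signal housing can shift those averages, and a wrong answer at an intersection is exactly the kind of single mistake the video shows can be one too many.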
Of course, it's a beta, so there are bound to be issues that still need to be worked through. The video does a very good job of highlighting the problems that can arise, and in what appears to be roughly 30 minutes of driving, plenty did. It's worth noting that San Francisco is probably one of the most difficult places to try out FSD Beta 9, given its unusual lane arrangements, heavy vehicle and pedestrian traffic, and steep changes in elevation. Even so, Tesla has its work cut out for it if FSD is going to become reliable enough for the majority of drivers to use safely, and that's something even Elon Musk knows.