As far as we can tell, for two reasons, maybe three. First, most of them paid $10,000 to access FSD, so naturally they want to get their money's worth. That also explains why some of the people in these clips are so mistaken about the performance of the system – they're experiencing a subconscious form of buyer's remorse, unwilling to face the fact that FSD doesn't offer anything close to Level 5 autonomy.
Second, they genuinely love Tesla and the evolution of technology, and they want to be part of it. By testing a product that may eventually become what it was promised to be, they'll get a sense of accomplishment if and when it does.
Finally, it makes them feel like part of something special: an exclusive group of people working collectively to advance humanity. Most of us will never experience that through our work or anything else we do, so this looks like a once-in-a-lifetime opportunity to give their lives purpose on a larger level.
Add it all together – plus the promise of playing video games on your commute someday – and what you get are people acting against everything common sense tells them while putting everyone else on the road around them in danger.
The first clip below shows a Tesla Model 3 attempting an unprotected left turn onto a two-way, three-lane road with a divider. By any measure, this is a difficult maneuver, especially for someone who is inexperienced and might struggle to judge the speed of approaching vehicles. Until now, that someone would have been a student driver, but if you think about it, that's precisely what the AI in Tesla's FSD is: a sixteen-year-old still learning how everything works on the street.
The tricky part with FSD, however, is that it speaks a totally different language than ours, so not only does it have to learn about stop signs and road markings and so on, but we also need to figure out how to communicate with it. That's no easy task, and after years of bullish promises, it seems even Elon Musk has come to admit it.
So the Model 3 has to cross three lanes of traffic before merging into the leftmost lane on the other side, ideally in one smooth movement. How does it do? Well, the first three times it simply says "no" and turns right instead, circling around to try again. The fourth time is the charm, apparently – though not for the driver of the silver SUV, who had to slow down and move to the right of their lane to avoid T-boning the Tesla as it took its sweet time crossing those three lanes.
OK, now that it's done that once, it must have learned something, and each attempt should get smoother. You might think so, but on the fifth attempt the car is so eager to go left that it veers into the oncoming lane before even reaching the main road. The driver does not intervene, so the Tesla gets halfway across the three lanes and, although the road is clear, once again says "no" and chooses the safety of a right turn instead.
We can't be sure why the vehicle repeated the same behavior on its sixth and seventh attempts – perhaps because the driver never intervened to correct it? – but it's scary to watch. And, as the driver rightly points out, that approach gives the car's cameras the worst possible angle on traffic coming down the main road, so it makes absolutely no sense.
The second video shows another Tesla running FSD Beta 9 as it tries to merge onto a reasonably busy highway – a maneuver a human driver would have no problem performing. The system seems to ignore the first gap, between a Jeep Wrangler and an Acura TLX, and aims for the admittedly larger one ahead of the Japanese sedan.
However, the cameras fail to pick up the Chevrolet Equinox taking that spot as it moves over from the middle lane, so the car is left in an awkward position: the merging area is about to end, and it isn't going nearly fast enough to pass the TLX. So what does the system do?
Well, it slows down considerably, blocking half of the on-ramp as the cars behind are forced to squeeze to the right to avoid rear-ending it. Then the FSD does what it always does when it cannot follow the intended route: it reroutes to give itself another go. In the end, it's not like the AI is ever in a hurry, and if the driver doesn't like it, they can take the wheel at any time and give the learner driver a moment to smoke a cigarette and calm its nerves.
EV drivers in general and Tesla drivers in particular aren't exactly the most popular people on the road, and while that's mostly for the wrong reasons, the kind of behavior these volunteer FSD Beta 9 testers display is not going to help.
On top of that, at some point something bad is going to happen, and while Tesla won't be legally responsible for it, we'll all know the company is the moral culprit for allowing regular Joes to beta-test a safety-critical system on public roads. Will it hurt Tesla's public image and sales? Probably not, but that's pretty damning for us as a species if that's all we care about.