Tesla’s Latest FSD Beta Doesn’t Seem Ready For Public Use, Which Raises Big Questions

What I like about this test is that it presents a very good mix of everyday, normal driving situations in an environment with a good mix of traffic density, road complexity, lighting conditions, road markings, and more. In short, reality, the same sort of entropy-heavy reality all of us live in and where we expect our machines to work.

There’s a lot that FSD does that’s impressive when you consider that this is an inert mass of steel and rubber and silicon that’s effectively driving on its own through a crowded city. We’ve come a long way since Stanley the Touareg won the DARPA Grand Challenge back in 2005, and there’s so much to be impressed by.

At the same time, this FSD beta proves to be a pretty shitty driver, at least in this extensive test session.

Anyone arguing that FSD in its latest state drives better than a human is either delusional, high from the fumes of their own raw ardor for Elon Musk, or in need of better-driving humans to hang out with.

FSD drives in a confusing, indecisive way, making all kinds of peculiar snap decisions and generally proving hard for the other drivers around it to read and predict. Which is a real problem.

Drivers expect a certain baseline of behaviors and reactions from the cars around them. That means there’s not much that’s more dangerous to surrounding traffic than an unpredictable driver, which this machine very much is.

And that’s when it’s driving at least somewhat legally; there are several occasions in this video where traffic laws were actually broken, including two instances of the car attempting to drive the wrong way down a street and into oncoming traffic.

Nope, not great.

In the comments, many people have criticized Kyle, the driver/supervisor, for allowing the car to make terrible driving decisions instead of intervening. The reasoning for this ranges from simple Tesla-fan rage, to the need for disengagements to help the system learn, to concern that by not correcting the mistakes, Kyle is potentially putting people in danger.

They’re also noting that the software is very clearly unfinished and in a beta state, which is pretty clearly true as well.

These are all reasonable points. Well, the people just knee-jerk shielding Elon’s Works from any scrutiny aren’t reasonable, but the other points are, and they bring up bigger issues.

Specifically, there’s the fundamental question of whether or not it makes sense to test an unfinished self-driving system on public roads, surrounded by people, in or out of other vehicles, who did not agree to participate in any sort of beta testing.

You could argue that a student driver is a human equivalent of beta testing our brain’s driving software, though when this is done in any official capacity, there’s a professional driving instructor in the car, sometimes with an auxiliary brake pedal, and the car is often marked with a big STUDENT DRIVER warning.

Image: JDT/Tesla/YouTube

I’ve proposed the idea of some kind of warning lamp for cars under machine control, and I still think that’s not a bad idea, especially during the transition era we find ourselves in.

Of course, in many states, you can teach your kid to drive on your own without any special permits. That context is quite similar to that of FSD beta drivers, since they don’t have any special training beyond a regular driver’s license (and no, Tesla’s silly Safety Score does not count as special training).

In both cases, you’re dealing with an unsure driver who may not make good decisions, and you may need to take over at a moment’s notice. On an FSD-equipped Tesla (or really any L2-equipped car), taking over should be easy, since your hands and other limbs should already be in position on the car’s controls.

In the case of driving with a kid, this is less easy, though still possible. I know because I was once teaching a girlfriend at the time how to drive and had to take control of an old manual Beetle from the passenger seat. You can do it, but I don’t recommend it.

Of course, when you’re teaching an uncertain human, you’re always very, very aware of the situation and nothing about it would give you a sense of false confidence that could allow your attention to waver. This is a huge problem with Level 2 semi-automated systems, though, and one I’ve discussed at length before.

As far as whether or not the FSD beta needs driver intervention to “learn” from all the dumb things it did wrong, I’m not entirely sure this is true. Tesla has mentioned the ability to learn in “shadow mode,” which would eliminate the need for FSD to be active for it to learn driving behaviors by example.

As far as Kyle’s willingness to let FSD beta make its bad decisions, sure, there are safety risks, but it’s also valuable to see what it does to give an accurate sense of just what the system is capable of. He always stepped in before things got too bad, but I absolutely get that this in no way represents safe driving.

At the same time, showing where the system fails helps FSD users get a better sense of the capabilities of what they’re using, so they can understand how vigilant they must be.

This is all really tricky, and I’m not sure yet of the best practice solution here.

This also brings up the question of whether Tesla’s goals make sense in regard to what’s known as their Operational Design Domain (ODD), which is just a fancy way of saying “where should I use this?”

Tesla has no restrictions on their ODD, as referenced in this tweet:

This raises a really good point: should Tesla define some sort of ODD?

I get that their end goal is Level 5 full, anywhere, anytime autonomy, a goal that I think is kind of absurd; full Level 5 is decades and decades away. If Tesla freaks are going to accuse me of literally having blood on my hands for allegedly delaying, somehow, the progress of autonomous driving, then you’d think the smartest move would be to restrict the ODD to areas where the system is known to work better (highways, etc.) to allow for wider automated deployment sooner.

That would make the goal more Level 4 than 5, but the result would be, hopefully, safer automated vehicle operation, and, eventually, safer driving for everyone.

Trying to make an automated vehicle work everywhere, in any condition, is an absolutely monumental task, and there’s still so, so much work to do. Level 5 systems are probably decades away, at best. Restricted-ODD systems could be deployed much sooner, and maybe Tesla should consider doing that, just like many other AV companies (Waymo, Argo, and so on) are doing.

We’re still in a very early transition period on this path to autonomy, however that turns out. Videos like these, which show the real-world behavior of such systems, problems and all, are very valuable, even if we’re still not sure about the ethics of making them.

All I know is that now is the time to question everything, so don’t get bullied by anyone.

TikToker With A Viral Video Of New Tesla With Improperly Installed Airbag Didn’t Actually Own The Car

Image: Tesla/TikTok

Despite what my recent colonic lab results suggest, I’m human. And, as a human, I make mistakes. Statistically, I suspect I make many more mistakes than an average human. One of these mistakes was taking a TikToker at their word regarding a Tesla they claimed to own. The TikToker in question is the one we wrote about last week, who claimed to be taking delivery of his new Tesla when the airbag came off in his hand. According to another video, it appears he did not own the car he sat in, and the cops told him to stay clear of the Tesla Store.

Here’s the video where the TikToker, bird owner, and hopeful Tesla owner, Rico Kimbrough, comes clean regarding ownership of the car:

And, for reference, here was the TikTok that started all of this mess in the first place:

Somehow, I suspect Elon will be able to weather the financial fallout of this okay, despite Kimbrough’s concern. And, yes, to the Children of Elon—that angelic, pious group of noble humans who, for various curious reasons, have tied their personal identities to a for-profit company, I humbly and sincerely apologize for taking the word of a man who claimed to have just bought a new Tesla via a series of videos.

I did reach out to Kimbrough, and have continued to, so far to no avail. The original story was as much about the sensation the video caused as the event depicted, because, as we all know, Tesla is an automaker that commands an awful lot of public attention, as the original TikTok video going viral demonstrates.

In fact, I sort of addressed all of this in the original article:

Of course, Tesla is hardly the only carmaker to experience quality issues, but here’s the thing about Tesla: if you have a car people are so excited about that they post dozens of videos of the process, then when they incidentally document a glaring issue, you have to expect attention for that, too.

I’m saying this because I already know my social media is about to be clogged with quivering Tesla-stans ranting at me about my focus on Tesla’s failings and my cruel, miserable bias against Tesla and sweet, innocent Elon Musk, but the reality is that nothing’s free.

If there’s a car brand with so much popular culture clout and attention and positivity that—as happened to me just today—I get emails pitching stories about how Tesla owners do better on online dating sites, then you’ve got to accept the flip side of that valuable attention-coin, which is that if that brand fucks something up, that gets talked about, too.

I also did hedge regarding the possibility that Kimbrough might not be telling the truth in that earlier article:

And, even if, wildly improbably, Kimbrough isn’t really the owner, and he and his birds just staged an absurdly involved hoax to discredit Tesla for clicks or some other incomprehensible reason, and he was just sitting in a car he didn’t own with an airbag that comes off, I don’t see how that’s any better.

Clearly, I was very wrong about the “wildly” and “improbably” part, suggesting I’m quite naive about the motives and ethics of TikTokers.

Now, the real issue is that none of this changes the fact that the airbag came off in our less-than-forthright friend’s hand.

Of course, his admission that the car wasn’t his could mean there’s a whole greater level of nefarious activities going on here, activities that would require a lot more than getting in a car and shooting some videos.

I suppose it’s possible Kimbrough’s intent from the start was to specifically remove the airbag unit, an act that requires tools and at least a bit of research, all in an effort to cause Tesla to look bad.

I’m skeptical this was the goal, as all of Kimbrough’s videos prior to the airbag falling out were wildly pro-Tesla. The man certainly presents himself as a fan in a convincing manner, though he also presented himself as a new owner with some conviction.

So, yes, that’s possible. It’s also possible that this is a manufacturing defect from a company with a rich, lavish history of defects, and even if the car was in Transport Mode (see the previous article about why that should not affect whether or not an airbag was properly installed), a car making it to the lot with an improperly installed airbag is not great, no matter who owns the car or who’s lying about owning it.

I’m positive that, no matter what, that airbag would have been sorted very early in the car’s ownership—likely before it left the lot—but there’s no reason not to call out such issues, especially when they generate as much attention as this one did, long before any of the many media outlets that covered it wrote about it.

So, to all of you Tesla-stans out there who remain very, very engaged with any and all interactions that involve Teslas, again, my apologies. I’m sorry. The TikToker did not own that car, and I dearly hope that his claims of ownership — and my amplification of those claims — did not cause you any harm, somehow.

But maybe still make sure your new Teslas have their airbags properly installed, and, yes, I still think that yoke sucks.

If I can get through to Kimbrough directly, I will update accordingly. He does seem to be a genuine Elon/Tesla fan, though, so I maybe shouldn’t get my hopes up that he’ll want to talk to me.
