Is Tesla FSD Beta Safe? (Part 2)



The Tesla FSD City Streets Beta is now in public testing, meaning thousands of Tesla customers have access to experimental software while its issues are still being ironed out. In the last video we found out that the testing program isn’t really a testing program, but the most important question we can ask is: “Is Tesla FSD Beta Safe?”

This video is Part 2 of a series on Tesla FSD Beta testing and explores whether Tesla’s testing of the FSD City Streets Beta can be considered safe. Does Tesla follow best practice? How do they make sure that no one gets hurt during the testing process? Have they already messed up?

Phil Koopman’s Website: https://users.ece.cmu.edu/~koopman/
Phil Koopman’s LinkedIn: https://www.linkedin.com/in/philip-koopman-0631a4116/
Jason from the Autopian helped with the Tullock Spike background: https://twitter.com/JasonTorchinsky

SAE J3018 ($80): https://www.sae.org/standards/content/j3018_202012/
AVSC00001201911 (similar to SAE J3018, but free to download): https://www.sae.org/standards/content/avsc00001201911/
NTSB Report about Uber 2018 fatality: https://data.ntsb.gov/Docket/?NTSBNumber=HWY18MH010

CONNECT WITH ME!

🐦 Twitter: https://twitter.com/MoodyHikmet
💬 Discord: https://discord.gg/kgznd5x
🎮 Twitch: https://twitch.tv/sadmoody

0:00 – Intro
1:01 – Safety-Critical Systems
5:47 – Tesla Full Self-Driving
10:06 – FSD City Streets Beta
13:21 – Tesla Safety Score
19:42 – Risks of Gamification
24:58 – Test Driver Best-Practice
27:54 – Tesla Safety-Driver Training
29:53 – Tesla Safety-Driver Monitoring
35:01 – Expelled from FSD Beta
40:33 – How to NOT Deploy Software Safely
48:13 – Where are the FSD Beta Accidents?
58:08 – Outro

#fsdbeta #tesla #teslafsd


30 thoughts on “Is Tesla FSD Beta Safe? (Part 2)”

  1. I do have a question: if a future full self-driving car, as in Level 5, gets into an accident, who would assume responsibility? The driver who no longer needs/wants to pay attention, the manufacturer, or someone/something else?

  2. In Norway the roads are covered in snow half the year, and even road signs will be covered in snow a lot of the time. I suspect there will be a wave of a whole new kind of accident when FSD starts being used. They won't fail at the things humans are bad at, though. lol

  3. FSD beta has been out long enough that we can officially declare that, yes, it is safe. The only thing you need is statistics; it's really not something you can argue about.

    And please, don't bring up nonsense like rolling stop signs being unsafe. It is indeed illegal, but it's also something almost every driver does, and it's pretty disingenuous to act like it's actually risky.

  4. 27:15 It's always perplexed me that the same guy who is fine with simulators and rigorous training/testing for Dragon pilots going into space nonchalantly lets engineers ship buggy FSD to 100,000 randos. If NASA didn't require that level of testing, would Dragon ship the same way? Why hasn't NHTSA protected consumer drivers with at least a similar level of friction?

    Before anyone gets cute, I know space is vastly more dangerous than the road, but as Mahmood is pointing out, this is an entire (often buggy) system we're testing. 10.3's automatic emergency braking bug (that was FIRMWARE level!) could've killed me if someone were closer behind on the highway.

  5. Aaah… yes! I'm quite familiar with those Twitter users. They jump on every comment criticizing Tesla/SpaceX/Boring Company/Neuralink/SolarCity to defend their cult leader. I ended up blocking them, as they were contributing nothing and kept popping up everywhere, even in topics I had unfollowed (Twitter algorithms suck). Anyway, in my opinion, Tesla is still allowed to exist because those responsible for taking action are either in on it or are being held back by others that have a lot of $$$ to lose.

  6. 46:16 Thanks for covering this. The 10.3 rollout was absolutely unacceptable. I had no idea they toggled off the items that were malfunctioning. 😣 Communication is one of Tesla's biggest areas for improvement.

  7. You've disproven your entire premise. You keep posting the graph showing as a fact that overall safety goes down in the "Valley of Degraded Supervision"; but your guest stated at the beginning that this may occur, so this is apparently just a hypothesis. And you yourself acknowledge that FSD beta has a very good safety record ("Where are the FSD Beta Accidents?"); in fact, it's exemplary. So this clearly isn't occurring yet. If it does begin to occur, then Tesla can take various measures to bring this back under control.

    You also fail to point out, although it's in some of the text, that there are two primary components to FSD: Autosteer on City Streets, which is what the beta test program is, and Navigate on Autopilot (NOA). NOA is fully released to all who purchased FSD and works extremely well; I have myself driven thousands of miles on NOA with only a handful of interventions or disengagements. As you pointed out in your video, this is a much easier problem than city streets, which of course is why it was released first. But the point is that those who purchased FSD are still getting a significant benefit from it, just not the entire benefit.

    While those who bought it early hoping for full implementation certainly have reason to be unhappy, they are free to file a class action lawsuit if they choose. As far as I know, they haven't done so. Meanwhile, Tesla now allows users to purchase FSD a month at a time on a subscription basis; so those who want the benefits of NOA, e.g. for a road trip, may purchase it just for the trip.

    You and many others have highlighted Musk's frequent projections about when higher levels of automation will be achieved. In a recent talk, he dealt credibly with the problem of local minima: each time they thought they were approaching the goal, it turned out that the actual peak was still further ahead. Those who accuse him of lying (you didn't) have the burden of proving that he knew, at the time he made these projections, that they couldn't be achieved. Otherwise they weren't lies, just projections that turned out in the cold light of day to be false.

    You have also not acknowledged the dramatic progress made since starting from scratch with the NN version late last year (I'm sure someone will fact-check me on when this occurred). Yes, you can cherry-pick public videos and find occasional examples of FSD beta attempting very dangerous things. But apparently every one of these was intercepted by the driver, as intended; and with every release there are more and more testers posting videos of longer and longer drives with no disengagements or interventions.

    Although I didn't listen through to the end, as far as I could tell, you also did not acknowledge that Tesla will be bringing one of the most powerful computers in the world (Dojo) online this year, for the sole purpose of training the neural networks. Further, the number of testers sending FSD beta errors back to the mothership for correction has increased by a factor of 50 in the last six months or so, with probably 50 million or so miles being driven every month on FSD beta. As you said, where are all the accidents, if FSD beta is so unsafe?

  8. Thanks for making these videos, Mahmood. I consider myself pretty knowledgeable on Tesla grift, but there is a lot of new info here I wasn't aware of. Remotely disabling safety features without letting the user know, good lord, they're so negligent.

  9. Would love to watch your conversation with Professor Koopman. Enjoyed the clips included here and want more. Torch isn't usually my cup of tea, but some will enjoy that too. He may surprise me, especially in conversation with you.

  10. Incredible, thank you for working so hard on this video! I learned so much about safety critical systems and testing procedures. You have a real knack for making this sort of info very approachable 😁 (Also the spike part had me rolling)

  11. I'm just wondering how large a portion of Tesla's profits comes from selling people this $12k software. If you're buying a new Model 3, that is quite a large portion of its base price. I believe Tesla has to push FSD on people; they would not be profitable without it.

  12. Just spent a full hour of my life which I'll never get back on watching part two and don't regret a minute of it. Thanks for the great work. It's a pity you only have 2k subscribers when the things you publish are such meticulously researched and well presented must-watch essentials for everybody involved or even just seriously interested in this topic.

  13. Dr. Hikmet,

    I've posted this comment before, but somehow it keeps disappearing within moments after being posted. One poster below suggested that this may be because it contains an external link to the NHTSA document I referenced for vehicle fatality rates. I'm reposting it without that external link, in hopes that this was the cause, and it won't disappear and you can respond to it.

    You quote Sam Peltzman as saying that “Safety regulation has had no effect on the highway death toll. There is some evidence that regulation may have increased … the total number of accidents”. This is from “The Effects of Automobile Safety Regulation”, which was published 47 years ago, in 1975. At that time, the motor vehicle fatality rate was 3.35/100 million miles. In 1966, the first year in which seat belts were required to be installed in American cars, the rate was 5.55/100 million miles; so by the time Peltzman wrote this, the fatality rate had already been reduced by about 40% over the preceding nine years. If not by safety regulations, then by what?

    And since his paper was published in 1975, the fatality rate has been further reduced from 3.35/100 million miles to 1.11/100 million miles (as of 2019, the most recent year for which data is available), which is another 67% reduction. Is it possible, just possible, that a nearly 50-year-old paper that was provably wrong even when it was released, and has since proven to be spectacularly wrong, may not be the best basis for your claims about Tesla FSD beta?

  14. This is exactly Musk's mindset with everything he does: reapply software development methods to everything. This thinking is not limited to FSD; it extends to everything Tesla does, including manufacturing. As someone who knows a thing or two about manufacturing, I was laughing my ass off when Elon talked about the “Alien Dreadnought” of manufacturing, where “air friction will become a problem” because they will be able to make the manufacturing lines so fast, and about how automated and self-sufficient it will all be. This guy is so clueless about most things, yet he was able to build this aura of everyman-genius-Tony-Stark around himself. People take everything he says for granted; it is unbelievable.

  15. Tesla's approach has been criticized nearly from the beginning. It's quite a bit different from that of other companies pursuing FSD in that it's simultaneously positioned as a “luxury product” (exclusive, expensive) and also implemented “quick and dirty” (cameras only, brute-force data collection, poorly controlled testing).

    And… in 18 months their auto business may become irrelevant, given that the competition seems to be progressing well towards robotaxi and robotruck deployments using more conservative engineering practices and limited coverage areas, while Tesla FSD is… definitely improving in some respects, but also taking on a far more generalized version of the problem by skipping to Level 5 instead of building experience at Level 4. A solved Level 4 deployed at scale is sufficient to capture the market for most transportation needs, because it's effectively a small generalization of pre-routed public transit: it can do the same things as a bus system, but in a more dynamic way and with lower operational cost. Busy routes get allocated shared robobuses, and lower-traffic ones get cars or bikes. This is something that Uber etc. have already studied with human drivers, to some success; cost is the main barrier keeping things at the current equilibrium.

  16. AI Drivr is a reckless clown who couldn't even tell he wasn't using the beta on his “let's see what I can hit” test (and you didn't catch that either). Also, your comment on Tesla fires is way off. They don't catch fire at high rates compared to equivalent ICE cars, not even close.

  17. Unless Tesla adds IR LEDs to the cabin cam of older cars like they did for the newer S/X, it's useless at night unless they apply so much gain that the image gets noised out. If they use the screen for light, that's not reliable enough based on the response curve of their camera.

  18. Their engineering approach will never produce a full self-driving vehicle, period.
    Elon is too much of a rocket man now to see it, and he doesn't pay engineers enough to have anyone on the payroll notice it either :p

