TL;DR: Self-Driving Teslas Rear-End Motorcyclists, Killing at Least 5

Brevity is the soul of wit, and I am just not that witty. This is a long article; here is the gist of it:

  • The NHTSA’s self-driving crash data reveals that Tesla’s self-driving technology is, by far, the most dangerous for motorcyclists, with five fatal crashes that we know of.
  • This issue is unique to Tesla. Other self-driving manufacturers have logged zero motorcycle fatalities with the NHTSA in the same time frame.
  • The crashes are overwhelmingly Teslas rear-ending motorcyclists.

Read our full analysis as we go case-by-case and connect the heavily redacted government data to news reports and police documents.

Oh, and read our thoughts about what this means for the robotaxi launch that is slated for Austin in less than 60 days.

  • SkunkWorkz@lemmy.world · +8 · 9 hours ago

    It’s because the system has to rely on visual cues, since Teslas have no radar. When it’s dark, the system looks at the tail lights to gauge distance to the vehicle ahead. And since some bikes have two closely spaced lights, the system thinks it’s a far-away car in front of it, when in reality it’s a bike up close. Also remember the AI is trained on human driving behavior, which Tesla records from its customers. And we all know how well the average human drives around two-wheeled vehicles.
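
As a rough illustration of that failure mode, here is a toy sketch (not Tesla’s actual pipeline; the focal length and lamp-spacing values are assumptions) of how a twin-lamp motorcycle can be misread as a distant car when distance is inferred from tail-light separation:

```python
# Toy pinhole-camera model of judging distance from a pair of tail lights.
# If the system assumes the two lights are car-width apart (~1.5 m), a
# motorcycle with twin lamps only ~0.25 m apart looks several times farther
# away than it really is.

FOCAL_LENGTH_PX = 1000.0        # assumed camera focal length, in pixels
ASSUMED_LAMP_SPACING_M = 1.5    # assumed spacing of a car's tail lights

def pixel_separation(true_spacing_m: float, true_distance_m: float) -> float:
    """Pixel gap between two lamps as seen by an ideal pinhole camera."""
    return FOCAL_LENGTH_PX * true_spacing_m / true_distance_m

def estimated_distance_m(pixel_sep: float) -> float:
    """Distance inferred if the lamps are assumed to be car-width apart."""
    return FOCAL_LENGTH_PX * ASSUMED_LAMP_SPACING_M / pixel_sep

# A motorcycle with lamps 0.25 m apart, actually 20 m ahead:
sep = pixel_separation(true_spacing_m=0.25, true_distance_m=20.0)
print(estimated_distance_m(sep))  # 120.0 -> read as a distant car, not a close bike
```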

  • Redex@lemmy.world · +9 · 11 hours ago

    Cuz other self driving cars use LIDAR so it’s basically impossible for them to not realise that a bike is there.

    • KayLeadfoot@fedia.io (OP) · +1 · 2 hours ago

      They call it the Model 3 because the Tesla Organ-Harvester didn’t translate well to Chinese

  • Ledericas@lemm.ee · +24 · 22 hours ago

    The Cybertruck is sharp enough to cut a deer in half; surely a biker is just as vulnerable.

        • KayLeadfoot@fedia.io (OP) · +5 · 17 hours ago

          Bahaha, that one is new to me.

          Back when I worked on an ambulance, we called the no helmet guys organ donors.

          This comment was brought to you by PTSD, and has been redacted in a rare moment of sobriety.

          • mutual_ayed@sh.itjust.works · +3 · 14 hours ago

            I also rammed 10cc spikes at the back of the bus; the world needs organ donors, and motorcycles provide a great service for that. Hope your EMT career was short-lived but rewarding.

        • Excrubulent@slrpnk.net · +3 · 16 hours ago

          I remember finding a motorcycle community on reddit that called themselves “squids” or “squiddies” or something like that.

          Their whole thing was putting road tyres on dirtbikes and riding urban environments like they were offroad obstacles. You know, ramping things, except on concrete.

          They loved to talk about how dumb & short-lived they were. I couldn’t ever find that group again, so maybe I misremembered the “squid” name, but I wanted to find them again, not to ever try it - fuck that - but because the bikes looked super cool. I just have a thing for gender-bent vehicles.

          • real_squids@sopuli.xyz · +4 · 14 hours ago

            Calamari Racing Team. It’s mostly a counter-movement to r/Motorcycles, where most of the posters are seen as anti-fun. Their whole thing is being that counter-movement rather than any specific way to ride; they also have a legendary commenter who pays money for pics in full leather.

            • Excrubulent@slrpnk.net · +1 · edited · 7 hours ago

              That’s the one! Thanks, that was un-googleable for me.

              I guess the road-tyres-on-dirt-bikes thing was maybe a trend when I saw the sub.

    • Psythik@lemm.ee · +6 · 22 hours ago

      As someone who likes the open sky feeling, this is why I drive a convertible instead.

  • AnimalsDream@slrpnk.net · +31/−1 · edited · 5 hours ago

    I imagine bicyclists must be affected as well if they’re on the road (as we should be, technically). As somebody who has already been literally inches away from being rear-ended, this makes me never want to bike in the US again.

    Time to go to the Netherlands.

    • EndlessNightmare@reddthat.com · +9 · 18 hours ago

      this makes me never want to bike in the US again.

      I live close enough to work for it to be a very reasonable biking distance. But there is no safe route. A high-speed “stroad” with a narrow little bike lane. It would only be a matter of time before some asshole with their face in their phone drifts into me.

      I am deeply resentful of our automobile-centric infrastructure in the U.S. It’s bad for the environment, bad for our wallets, bad for our waistlines, and bad for physical safety.

    • xor@lemmy.dbzer0.com · +4/−7 · 1 day ago

      Human-driven cars still target bicyclists on purpose, so I don’t see how Teslas could be any worse…

      P.S. Painting a couple of lines on the side of the road does not make a safe bike lane… bike lanes need a physical barrier separating them from the road… like how curbs separate the road from sidewalks…

      • AnimalsDream@slrpnk.net · +1 · 22 hours ago

        I mean yeah, I just said above that someone almost killed me. They were probably a human driver. But that’s a “might happen, never know.” If self-driving cars are rear-ending people, that’s an inherent artifact of their programming, even though they’re not intentionally programmed to do that.

        So it’s like, things were already bad. I already do not feel safe doing any biking anymore. But as self-driving cars become more prevalent, that threat upgrades to a kind of de facto “Oh, these vast stretches of land are places where only cars and trucks are allowed. Everything else is roadkill waiting to happen.”

  • captainastronaut@seattlelunarsociety.org · +143/−1 · 2 days ago

    Tesla self driving is never going to work well enough without sensors - cameras are not enough. It’s fundamentally dangerous and should not be driving unsupervised (or maybe at all).

    • KayLeadfoot@fedia.io (OP) · +87/−1 · 2 days ago

      Accurate.

      Each fatality I found where a Tesla kills a motorcyclist is a cascade of 3 failures.

      1. The car’s cameras don’t detect the biker, or it just doesn’t stop for some reason.
      2. The driver isn’t paying attention to detect the system failure.
      3. The Tesla’s driver alertness tech fails to detect that the driver isn’t paying attention.

      Taking out the driver will make this already-unacceptably-lethal system even more lethal.

      • jonne@infosec.pub · +66 · 2 days ago
        1. Self-driving turns itself off seconds before a crash, giving the driver an impossibly short timespan to rectify the situation.
        • KayLeadfoot@fedia.io (OP) · +63 · 2 days ago

          … Also accurate.

          God, it really is a nut punch. The system detects the crash is imminent.

          Rather than automatically try to evade… the self-driving tech turns off. I assume it is to reduce liability or make the stats look better. God.

          • jonne@infosec.pub · +37 · edited · 2 days ago

            Yep, that one was purely about hitting a certain KPI of ‘miles driven on autopilot without incident’. If it turns off before the accident, technically the driver was in control and to blame, so it won’t show up in the stats and probably also won’t be investigated by the NTSB.

              • KayLeadfoot@fedia.io (OP) · +23 · 2 days ago

                NHTSA collects data if self-driving tech was active within 30 seconds of the impact.

                The companies themselves do all sorts of wildcat shit with their numbers. Tesla’s claimed safety factor right now is 8x human: driving with FSD is supposedly 8x safer than your average human driver, that’s what they say on their stock earnings calls. Of course, that’s not borne out by any data I’ve seen; they haven’t published data that makes it externally verifiable (unlike Waymo, which has excellent academic articles and insurance papers written about its roughly 12x-safer-than-human system).
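
As an illustration of how a reporting rule like that behaves, here is a hedged sketch (the 30-second window comes from the comment above; the record fields and function name are assumptions for illustration):

```python
from dataclasses import dataclass

@dataclass
class CrashRecord:
    impact_time_s: float            # time of impact
    last_adas_active_time_s: float  # last moment the driver-assist system was engaged

def counts_as_adas_crash(crash: CrashRecord, window_s: float = 30.0) -> bool:
    """A crash is attributed to the driver-assist system if it was active at any
    point within `window_s` seconds before impact, so switching itself off
    moments before the collision does not exclude it from reporting."""
    return (crash.impact_time_s - crash.last_adas_active_time_s) <= window_s

# System disengages 2 seconds before impact: still a reportable ADAS crash.
print(counts_as_adas_crash(CrashRecord(impact_time_s=100.0, last_adas_active_time_s=98.0)))  # True
```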

                • NotMyOldRedditName@lemmy.world · +1 · edited · 16 hours ago

                  So to drive with FSD is 8x safer than your average human driver.

                  WITH a supervising human.

                  Once it reaches a certain quality, it should be safer if a human is properly supervising it, because if the car tries to do something really stupid, the human takes over. The vast vast vast majority of crashes are from inattentive drivers, which is obviously a problem and they need to keep improving the attentiveness monitoring, but it should be safer than a human with human supervision because it can also detect things the human will ultimately miss.

                  Now, if you take the human entirely out of the equation, I very much doubt that FSD is safer than a human in its current state.

              • jonne@infosec.pub · +9/−1 · 2 days ago

                If they ever fixed it, I’m sure Musk fired whoever is keeping score now. He’s going to launch the robotaxi stuff soon and it’s going to kill a bunch of people.

        • NeoNachtwaechter@lemmy.world · +18 · 2 days ago

          Even when it is just milliseconds before the crash, the computer turns itself off.

          Later, Tesla brags that the autopilot was not in use during this (terribly, overwhelmingly) unfortunate accident.

      • br3d@lemmy.world · +9/−1 · edited · 1 day ago

        There are at least two steps before those three:

        -1. Society has been built around the needs of the auto industry, locking people into car dependency

        1. A legal system exists in which the people who build, sell and drive cars are not meaningfully liable when the car hurts somebody
        • grue@lemmy.world · +6/−3 · 1 day ago
          1. A legal system exists in which the people who build, sell and drive cars are not meaningfully liable when the car hurts somebody

          That’s a good thing, because the alternative would be flipping the notion of property rights on its head. Making the owner not responsible for his property would be used to justify stripping him of his right to modify it.

          You’re absolutely right about point -1 though.

          • explodicle@sh.itjust.works · +3 · 1 day ago

            build, sell and drive

            You two don’t seem to strongly disagree. The driver is liable but should then sue the builder/seller for “self driving” fraud.

            • grue@lemmy.world · +2 · 1 day ago

              Maybe, if that two-step determination of liability is really what the parent commenter had in mind.

              I’m not so sure he’d agree with my proposed way of resolving the dispute over liability, which would be to legally require that all self-driving systems (and software running on the car in general) be Free Software, squarely and completely within the control of the vehicle owner.

                • grue@lemmy.world · +2 · 1 day ago

                  I mean, maybe, but previously when I’ve said that it’s typically gone over like a lead balloon. Even in tech forums, a lot of people have drunk the kool-aid that it’s somehow suddenly too dangerous to allow owners to control their property just because software is involved.

    • Ledericas@lemm.ee · +3 · 22 hours ago

      They originally had lidar, or radar, but Musk had them disabled in the older models.

      • NotMyOldRedditName@lemmy.world · +2 · edited · 16 hours ago

        They had radar. Tesla has never had lidar, but they do use lidar on test vehicles to ground truth their camera depth / velocity calculations.

    • ascense@lemm.ee · +17/−1 · 2 days ago

      Most frustrating thing is, as far as I can tell, Tesla doesn’t even have binocular vision, which makes all the claims about humans being able to drive with vision only even more blatantly stupid. At least humans have depth perception. And supposedly their goal is to outperform humans?

      • TheGrandNagus@lemmy.world · +26 · 2 days ago

        Tesla’s argument of “well human eyes are like cameras therefore we shouldn’t use LiDAR” is so fucking dumb.

        Human eyes have good depth perception and absolutely exceptional dynamic range and focusing ability. They also happen to be linked up to a rapid and highly efficient super computer far outclassing anything that humanity has ever devised, certainly more so than any computer added to a car.

        And even with all those advantages humans have, we still crash from time to time and make smaller mistakes regularly.

        • NABDad@lemmy.world · +12 · 1 day ago

          They also happen to be linked up to a rapid and highly efficient super computer far outclassing anything that humanity has ever devised

          A neural network that has been in development for 650 million years.

        • bluGill@fedia.io · +2/−4 · 1 day ago

          Anyone who has driven (or walked) into a sunrise/sunset knows that human vision is not very good. I’ve also driven in blizzards, heavy rain, and fog - all times when human vision is terrible. I’ve also not seen green lights (I’m colorblind).

          • TheGrandNagus@lemmy.world · +5 · edited · 1 day ago

            Human vision is very, very, very good. If you think a camera installed to a car is even close to human eyesight, then you are extremely mistaken.

            Human eyes are so far beyond it’s hard to even quantify.

            And bullshit on you not being able to see the lights. They’re specifically designed so that’s not an issue for colourblind people.

            • bluGill@fedia.io · +1 · 1 day ago

              And bullshit on you not being able to see the lights. They’re specifically designed so that’s not an issue for colour blind people

              Some lights are, but not all of them. I often say I go when the light turns blue, but not all lights have that blue tint, so I often cannot tell the difference between a white light and a green light by color (white is not used in a stoplight, and I can see red/yellow just fine). Where I live, all stoplights have green on the bottom, so that is always a cheat I use, but it only works if I can see the relative position. In an otherwise dark situation I only see a light in front of me and not the rest of the structure, so I cannot tell. I have driven where stoplights are not green-on-bottom, and I can never remember whether green is left or right.

              Even when they try, though, not all colorblindness is the same. There may not be a mitigation that works for two different people with different forms of colorblindness.

            • bluGill@fedia.io · +1/−1 · 1 day ago

              Human vision is very, very, very good. If you think a camera installed to a car is even close to human eyesight, then you are extremely mistaken.

              Why are you trying to limit cars to just vision? That is all I have as a human. But robots have radar, lidar, radio, and other options; there is no reason they can’t use them and get information eyes cannot. Every option has limits.

          • explodicle@sh.itjust.works · +2 · 1 day ago

            Bro I’m colorblind too and if you’re not sure what color the light is, you have to stop. Don’t put that on the rest of us.

            • bluGill@fedia.io · +2 · 1 day ago

              I can see red clearly, so “not sure” means I can go.

              I’ve only noticed issues in a few situations. One was driving at night when a weirdly aimed stoplight suddenly turned yellow; until it changed I didn’t even know there was a stoplight there. The second was making a left turn at sunset (sun behind me): the green arrow came on but the red light still appeared lit, so I couldn’t tell it was time/safe to go until my wife alerted me.

    • scarabic@lemmy.world · +4 · 1 day ago

      These fatalities are a Tesla business advantage. Every one is a data point they can use to program their self-driving intelligence. No one has killed as many as Tesla, so no one knows more about what kills people than Tesla. We don’t have to turn this into a bad thing just because they’re killing people /s

  • Buffalox@lemmy.world · +83 · edited · 2 days ago

    Hey guys relax! It’s all part of the learning experience of Tesla FSD.
    Some of you may die, but that’s a sacrifice I’m willing to make.

    Regards
    Elon Musk
    CEO of Tesla

    • Gammelfisch@lemmy.world · +3/−1 · 1 day ago

      +1 for you. However, replace “Regards” with the more appropriate words from the German language. The first starts with an S, and the second with an H. I will not type that shit. Fuck Leon, and I hope the fucking Nazi-owned Tesla factory outside of Berlin closes.

      • Buffalox@lemmy.world · +2 · edited · 1 day ago

        Yes, I’m not writing that shit, even in a sarcastic post. But I get your drift.
        On the other hand, since you are from Germany: the VW Group is absolutely killing it on EVs lately, IMO.
        They totally dominate the EV top 10 here in Denmark, with 7 of the 10 top-selling models!
        They are competitively priced, and they are the best combination of quality and range in their price ranges.

  • 0x0@programming.dev · +56/−4 · edited · 1 day ago

    This is news? FortNine talked about it two years ago.
    TL;DR: Tesla removed radar to save a buck, and at night the cameras see two red dots that the ’puter thinks are a far-away car, when in fact it’s a close motorcycle.

    • EndlessNightmare@reddthat.com · +1 · 18 hours ago

      The argument is that humans can drive with just two eyes, so cameras are enough. I disagree with this position, given the limitations of a camera-only system. But that’s what it is.

      Different sensors excel at different tasks and different conditions, and cameras are not always it.

    • LesserAbe@lemmy.world · +18 · 1 day ago

      It’s helpful to remember that not everyone has seen the same stories you have. If we want something to change, like regulators not allowing dangerous products, then raising public awareness is important. Expressing surprise that not everyone knows about something can be counterproductive.

      Going beyond that, wouldn’t the new information here be the statistics?

      • JordanZ@lemmy.world · +13 · 1 day ago

        My state allowed motorcycle filtering in 2019 (not the same as California’s lane splitting). They ran a study and found a ton of motorcyclists were being severely injured or killed when they were rear-ended while sitting at stop lights. Filtering allows them to move to the front of the queue while the light is red and traffic is stationary. Many people are super aggravated about it, even though most of the world has been doing it basically forever.

      • bluGill@fedia.io · +3 · 1 day ago

        like regulators not allowing dangerous products,

        I include human drivers in the list of dangerous products I don’t want allowed. The question is whether self-driving is safer overall (despite possible regressions like this). I don’t want regulators to pick favorites. I want them to find “the truth”.

        • LesserAbe@lemmy.world · +1 · 1 day ago

          Sure, we’re in agreement as far as that goes. My point was just the commenter above me was indicating it should be common knowledge that Tesla self driving hits motorcycles more than other self driving cars. And whether their comment was about this or some other subject, I think it’s counterproductive to be like “everyone knows that.”

      • AA5B@lemmy.world · +2 · 22 hours ago

        Why not? It’s got multiple cameras, so it could judge distances the same way humans do.

        However, there have been both hardware and software updates since most of those crashes, so the critical question is how much of a problem it still is. The article had no info or speculation on that.

  • Gork@lemm.ee · +68 · 2 days ago

    Lidar needs to be a mandated requirement for these systems.

    • Echo Dot@feddit.uk · +19 · edited · 1 day ago

      Or at least something other than just cameras. Even just adding ultrasonic sensors to the front would be an improvement.

    • ℍ𝕂-𝟞𝟝@sopuli.xyz · +16/−1 · 1 day ago

      Honestly, emergency braking with LIDAR is mature and cheap enough at this point that it should be mandated for all new cars.

      • Nastybutler@lemmy.world · +4/−1 · 1 day ago

        No, emergency braking with radar is mature and cheap. Lidar is very expensive and relatively nascent

    • TrackinDaKraken@lemmy.world · +8 · 1 day ago

      How about we disallow it completely, until it’s proven to be SAFER than a human driver. Because, why even allow it if it’s only as safe?

      • explodicle@sh.itjust.works · +4 · 1 day ago

        As an engineer, I strongly agree with requirements based on empirical results rather than requiring a specific technology. The latter never ages well. Thank you.

        • scarabic@lemmy.world · +1 · edited · 1 day ago

          It’s hardly either/or, though. What we have here is empirical data showing that cars without lidar perform worse, so mandating lidar is grounded in empirical results. You can build a clear, robust requirement around a tech spec. You cannot build a clear, robust law around fatality-statistics targets.

          • explodicle@sh.itjust.works · +1 · 1 day ago

            We frequently build clear, robust laws around mandatory testing. Like that recent YouTube video where the Tesla crashed through a wall, but with crash test dummies.

            • scarabic@lemmy.world · +1 · edited · 1 day ago

              Those are ways to gather empirical results, though they rely on artificial, staged situations.

              I think it’s fine to have both. Seat belts save lives. I see no problem mandating them. That kind of thing can still be well founded in data.

      • scarabic@lemmy.world · +1 · 1 day ago

        This sounds good until you realize how unsafe human drivers are. People won’t accept a self-driving system that’s only 50% safer than humans, because that will still be a self-driving car that kills 20,000 Americans a year. Look at the outrage right here, and we’re nowhere near those numbers. I also don’t see anyone comparing these numbers to human drivers on any per-mile basis. Waymos compared favorably to human drivers in their most recently released data. Does anyone even know where Teslas stand compared to human drivers?

        • NotMyOldRedditName@lemmy.world · +1 · edited · 15 hours ago

          There have been 54 reported fatalities involving their software over the years in the US.

          That’s around 10 billion AP miles (9 billion at the end of 2024) and around 3.6 billion on the various versions of FSD (beta/supervised). Most of the fatal accidents happened on AP, though, not FSD.

          Let’s just double those fatal accidents to 108 to account for the rest of the world, but that probably skews high; most of the fatal cases I’ve seen are in the US.

          That equates to 1 fatal accident every 125.9 million miles.

          The US average is 1.33 deaths per 100 million miles, so even after doubling the deaths it’s below the current national average. That’s the equivalent of 1.33 deaths every 167 million miles with Tesla’s software.

          Edit: I couldn’t math, fixed it. Also, FSD specifically is only available in a few places, mainly North America and, just recently, China. I wish we had fatality numbers for FSD specifically.
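
The back-of-the-envelope comparison above can be reproduced directly; all of the figures are the commenter’s own estimates rather than verified data:

```python
# Reproduce the commenter's rough fatality-rate comparison.
us_fatalities = 54
worldwide_fatalities_est = us_fatalities * 2   # doubled to cover the rest of the world
autopilot_miles = 10e9                         # ~10 billion Autopilot miles
fsd_miles = 3.6e9                              # ~3.6 billion FSD miles
total_miles = autopilot_miles + fsd_miles

miles_per_fatality = total_miles / worldwide_fatalities_est
print(miles_per_fatality / 1e6)                # ~125.9 million miles per fatality

us_deaths_per_100m_miles = 1.33                # US average per 100 million vehicle miles
tesla_deaths_per_100m_miles = 100e6 / miles_per_fatality
print(round(tesla_deaths_per_100m_miles, 2))   # ~0.79, below the 1.33 national average
```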