• Cyrus Draegur@lemm.ee · edited · 2 days ago

    Autopilot literally switched ITSELF off less than half a second before the moment of impact. It didn’t try to stop the car, it just shut itself off so it couldn’t be blamed.

    Imagine if someone whipped throwing knives at your back and then tried to argue “but your honor I was not holding any knives at the time of the stabbing”

    Fuck Tesla, fuck Elon, fuck every simp who shits excuses out their mouths for him

    (I mean, not YOU, you aren’t doing any of those things; I’m just saying, those people. In general.)

    • FauxLiving@lemmy.world · 2 days ago

      It’s like a pilot bailing out of a plane and then claiming he was not responsible for the crash because he was in Vegas at the time the plane crashed.

    • pyre@lemmy.world · 2 days ago

      it’s always done this. it’s so they can claim AP wasn’t active during the crash and evade liability

    • Ledericas@lemm.ee · 2 days ago

      I think Elon made sure it switches off so he doesn’t get held liable when the autopilot is at fault.

    • Jesus@lemmy.world · 2 days ago

      AP is supposed to disable itself if a fault or abnormality is detected. Pretty much all advanced cruise control systems do this.

      I don’t think it’s fair to say the car was hiding evidence of AP being used unless it was intentionally logging the data in a shady way. We’d need to see the car’s logs, and there are some roundabout ways for a consumer to pull those. That would probably be an interesting test for someone on YouTube to run.

      • mosiacmango@lemm.ee · edited · 2 days ago

        These systems disable right before a crash because the US national traffic safety regulator (NHTSA) requires manufacturers to report whether these systems were engaged during an accident.

        It is not for safety or because of a malfunction; it’s for marketing. Car companies don’t want the features they sell for $3–8k coming up all the time in crash statistics.

        Tesla is the biggest offender here, likely due to the number of vehicles sold, but also due to its camera-only system and its aggressively false “full self driving” and “autopilot” marketing that far overpromises.

        • Jesus@lemmy.world · 2 days ago

          Just saying I’d like to see some more data. I get that Musk is not someone who should be trusted, especially around complying with regulators.

          That said, I could see that system being disengaged by some intended safety trigger.

          • Redjard@lemmy.dbzer0.com · 2 days ago

            At the very least the system should initiate an emergency brake when it disengages like that and there is no conflicting human input.

            • Jesus@lemmy.world · 2 days ago

              100% agree. My stupid Volvo does that, and it doesn’t have lidar or a million cameras around it.