• boonhet@lemm.ee · 2 days ago

    I watched a Tesla-sympathetic YouTuber for balance, and here are the key points he brought up:

    1. He had a death grip on the wheel (because, y’know, he knew he was going to crash). Exerting enough force on the steering wheel over time disables Autopilot, because the system assumes you want to manually override what it’s doing (a rough sketch of how such a check might work follows this comment).

    2. FSD is apparently much more capable, but this Tesla only had the standard Autopilot turned on, despite having FSD available (Mark apparently claimed he didn’t know he could turn it on without adding a destination).

    3. Mark might have some sort of sponsorship deal with the LIDAR company featured in the video, which is why LIDAR was shown in a much better light (e.g. it was shown stopping for a dummy behind the water spray, but in reality a LIDAR-based system would just brake for the water spray itself).

    Now all of those might be true, but you’re also correct that the emergency braking system should be operational even when Autopilot is disabled, unless the system malfunctioned (just having a dirty camera is enough). I know my Subaru drops adaptive cruise ALL the time. Stupid camera-based system. You’d think it would be better off because the cameras are at the top of the windshield, compared to most cars’ front-grille-mounted radars, but nah, it just keeps turning off.
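
    On point 1, here’s a minimal sketch of how a sustained-torque override check could work. This is an illustration only, not Tesla’s actual logic; the threshold and hold-time values are invented.

    ```python
    # Illustrative sketch only -- NOT Tesla's real logic; all values invented.
    TORQUE_THRESHOLD_NM = 3.0  # assumed steering torque that counts as an override
    HOLD_TIME_S = 0.5          # assumed time the torque must be sustained

    def should_disengage(torque_samples_nm, dt_s):
        """Return True if driver torque stays above the threshold long enough.

        torque_samples_nm: steering-wheel torque per control tick, in Nm
        dt_s: time between samples, in seconds
        """
        held_s = 0.0
        for torque in torque_samples_nm:
            if abs(torque) > TORQUE_THRESHOLD_NM:
                held_s += dt_s
                if held_s >= HOLD_TIME_S:
                    return True  # sustained force: treat as a manual override
            else:
                held_s = 0.0  # grip relaxed: reset the timer
        return False
    ```

    A death grip held through a swerve would trip a check like this almost immediately, which is consistent with Autopilot disengaging right before the crash.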

    • Honytawk@lemmy.zip · 2 days ago

      They are free to peer review the test and do it with all the stuff enabled.

      That is how science works.

      But I doubt they will, since this is an inherent problem with using camera vision only, not with the car’s software. And they most likely know it.

      • KubeRoot@discuss.tchncs.de · 2 days ago

        I will point out that I don’t think “peer review” means repeating the test; it means, more generally, pointing out issues with the science, right? By that definition, it sounds like that’s what they’re doing. That doesn’t make the criticisms inherently valid, but dismissing them with “they’re free to do their own tests” because “that is how science works” seems dishonest.

        • madnotangry@lemmy.world · 12 hours ago

          Peer review usually means repeating the test and comparing the results with the original paper’s. If reviewers can’t get the same results, it means the first study was faulty or wasn’t described accurately.

      • boonhet@lemm.ee · 2 days ago

        Humans also operate on “camera vision” only, in that we see visible light and that’s it. Adding LIDAR to the system should improve performance beyond human capability, but camera vision with good enough software (and this is way easier said than done) ought to be able to match human capability. Whether Tesla’s is good enough in FSD mode, I have no idea, because I have no intention of ever buying one, and testing this in a rental is uh… risky, given that Teslas tend to have onboard cameras.

        Of course, if Tesla’s “FSD”-branded driver-assist suite is actually good enough to beat this test, I reckon Tesla would be quick to prove it to save their own reputation. It’s not that hard to reproduce.

        • LeninOnAPrayer@lemm.ee · 13 hours ago

          https://www.adafruit.com/product/4058?gQT=1

          These are extremely, EXTREMELY reliable at detecting physical obstructions. There is no reason but stupidity or cheapness not to implement redundancy in a safety system. This isn’t about implementing “good enough” software. This is about a design choice forced on Tesla engineers by a complete idiot who doubles down on his stupidity when faced with criticism from actually intelligent people.
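
          To make the redundancy point concrete, here’s a toy sketch of an OR-gated obstacle check across independent sensors. The sensor names and braking threshold are made up for illustration; a real AEB stack is far more involved.

          ```python
          # Toy redundancy sketch -- sensor names and thresholds are invented.
          BRAKE_DISTANCE_M = 10.0  # assumed distance at which braking must start

          def obstacle_ahead(camera_m, radar_m, ultrasonic_m):
              """Brake if ANY independent sensor reports an obstacle in range.

              Each argument is a distance estimate in meters, or None when that
              sensor has no valid reading (blinded, out of range, etc.). OR-ing
              independent sensors means one fooled sensor (say, a camera staring
              at a painted wall) cannot veto a detection by the others.
              """
              readings = (camera_m, radar_m, ultrasonic_m)
              return any(d is not None and d < BRAKE_DISTANCE_M for d in readings)

          # The camera misses the fake wall (None), but the radar still sees a
          # physical obstruction, so the check still fires:
          assert obstacle_ahead(camera_m=None, radar_m=8.2, ultrasonic_m=None)
          ```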

        • Mandrilleren@lemmy.world · 2 days ago

          Not just good enough software; also good enough cameras and good enough processing power, none of which currently match humans, so this is not a valid argument.

          The camera-only system is just worse at everything.

    • greenhorn@lemm.ee · 2 days ago

      Even without the fanboys’ justifications, what did this test prove that the others didn’t, since it didn’t mimic a real-world scenario like the tests where the Tesla demolished the kid? I’ve driven through fog and lights and heavy rain, but have yet to encounter an unexpected Wile E. Coyote wall in the road.

      • helpImTrappedOnline@lemmy.world · 2 days ago

        The absurd test was mostly for the spectacle/views. Sometimes science means doing wacky things because we’re curious to find the limits.

        Someone else mentioned a blue truck at the crest of a hill that was invisible to the system, resulting in a crash. That’s probably the closest to a Wile E. Coyote scenario you’re going to get.

    • Manalith@midwest.social · 2 days ago

      If nothing else, Mark did say that the LIDAR company supplied the car, but that’s it: they had no say in the test and didn’t give him any money. Apparently they did put the video up on their site for a bit, but took it down, either because it looked bad given the backlash or because Mark told them to take it down, as it went against their agreement.

      Of course, he could have lied about the sponsorship, but he said he’s fine with a lawsuit, so that would be a bold strategy.