• SabinStargem@lemmy.today · 9 points · 1 day ago

    The obvious answer is to autopilot into ICE and Trump Regime officials. Elon pays the fine, the world is rid of MAGATs, and one less Tesla on the road. D, D, D.

    /s.

    • NιƙƙιDιɱҽʂ@lemmy.world · 14 points · 2 days ago

      Nah, it just disengages a fraction of a second before impact so they can claim “it wasn’t engaged at the moment of impact, so not our responsibility.”

      There were rumours about this for ages, but I honestly didn’t fully buy it until I saw it in Mark Rober’s vision vs. lidar video and the various follow-ups to it.

      • Tja@programming.dev · 3 points · 1 day ago

        It’s not about responsibility, it’s about marketing. At no point do they assume responsibility, just like any other Level 2 system. It would look bad if it was engaged, but you are 100% legally liable for what the car does on Autopilot (or the so-called “Full Self-Driving”). It’s just a lane-keeping assistant.

        If you trust your life (or the lives of others) to a lane-keeping assistant, you deserve to go to jail, be it Tesla, VW, or BYD.

        • NotMyOldRedditName@lemmy.world · 5 points · 2 days ago (edited)

          It turns off, but that’s likely so the AEB (automatic emergency braking) system can kick in.

          Autopilot and AEB are separate systems.

          Also, all L2 crashes that involve an airbag deployment or a fatality get reported if the system was on within something like 30 seconds beforehand, assuming the OEM has the data to report, which Tesla does.

          The rules are changing to lessen when a crash needs to be reported, so things like fender benders won’t necessarily be reported for L2 systems in the near future, but something like this would still be, and always has been.
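          To make that concrete, here’s a minimal Python sketch of the reporting check as described above. The ~30-second window and the severity criteria are just this comment’s paraphrase, not the actual regulation text, and all names and fields are made up for illustration:

          ```python
          # Sketch of the L2 crash-reporting rule described above: a crash is
          # reportable if the system was engaged within ~30s of impact AND the
          # crash involved an airbag deployment or a fatality. The window and
          # criteria are the comment's paraphrase; field names are invented.
          from dataclasses import dataclass
          from typing import Optional

          REPORTING_WINDOW_S = 30.0  # "something like 30 seconds beforehand"

          @dataclass
          class CrashRecord:
              airbag_deployed: bool
              fatality: bool
              # Seconds between the last L2 engagement and impact;
              # None means the system was never engaged on this drive.
              l2_engaged_s_before_impact: Optional[float]

          def is_reportable(crash: CrashRecord) -> bool:
              severe = crash.airbag_deployed or crash.fatality
              engaged_recently = (
                  crash.l2_engaged_s_before_impact is not None
                  and crash.l2_engaged_s_before_impact <= REPORTING_WINDOW_S
              )
              return severe and engaged_recently

          # Disengaging one second before impact doesn't dodge the report:
          print(is_reportable(CrashRecord(True, False, 1.0)))  # True
          ```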

          • xeekei@lemm.ee · 0 points · 1 day ago

            Ok, but if Tesla is using that report to get out of liability, we still have a damn problem.

            • NotMyOldRedditName@lemmy.world · 1 point · 19 hours ago (edited)

              If it’s an L2 system, the driver is always liable. The report just makes sure we know it’s happening and can force changes if patterns are found. The NHTSA made Tesla improve their driver monitoring based on that data, since driver monitoring was the main problem. The majority of accidents (almost all) involved drunk or distracted drivers.

              If it’s an L4 system, Tesla is always liable; in theory we’ll see that for the first time on public roads in Austin in June.

              The report never changes liability, it just lets us know what state the vehicle was in during the incident. Tesla can’t claim the system was off just because it disengaged one second before impact, because we’ll know it was on right up until then. But that doesn’t change liability either way.

  • guywithoutaname@lemm.ee · 11 points · 2 days ago

    I’d imagine you are always responsible for what you do when you’re driving, even if a system like autopilot is helping you drive.

  • randoot@lemmy.world · 17 points · 2 days ago

    Ha, if only. Autopilot turns off right before a crash so that Tesla can claim it was off and blame the driver. Look it up.

    • Tja@programming.dev · 3 points · 1 day ago

      The driver is always to blame, even if it was on. They turn it off for the marketing claims.

      PS: fuck elon

      • Sonicdemon86@lemmy.world · 6 points · 2 days ago

        Mark Rober made a video testing the autopilot systems of several cars, using his own Tesla as one of them. The car turned off Autopilot right as he crashed through a styrofoam wall.

        • randoot@lemmy.world · 9 points · 2 days ago

          This is how they claim Autopilot is safer than human drivers. In reality, Tesla has one of the highest fatality rates, but magically all of those deaths happen when Autopilot was “off”.

  • supersquirrel@sopuli.xyz · 5 points · 2 days ago (edited)

    Unironically, this is a perfect example of why AI is being used to choose targets to murder in the Palestinian genocide, and in cases like DOGE attacking the functioning of the U.S. government, US health insurers denying claims, and landlord software colluding to raise rents.

    The economic function of AI is to let you abdicate responsibility for your actions so you can make a bit more money while hurting people, and until the public becomes crystal clear on that, we are in a wild amount of danger.

    Just substitute for Elon the vague idea of a company that becomes a legal and ethical scapegoat for brutal choices made by individual humans.

      • AnarchistArtificer@slrpnk.net · 1 point · 2 days ago

        I did an internship at a bank way back, and my role involved a lot of processing of spreadsheets from different departments. I automated a heckton of that with Visual Basic, which my boss was okay with, but I was dismayed to learn that I wasn’t saving anyone’s time except my own, because once the internship finished, all of the automation stuff would have to be deleted. The reason was a rule (I think a company policy rather than a law) requiring that any code be in the custody of someone, for accountability purposes: “accountability” in this case meaning “if we take unmaintained code for granted, then we may find an entire department’s workflow crippled at some point in the future, with no one knowing how it’s meant to work”.

        It’s quite a different thing from what you’re talking about, but in terms of the implementation, it doesn’t seem too far off.

  • ZkhqrD5o@lemmy.world · 1 point · 2 days ago (edited)

    TL;DR: Take the train and be safe.

    Rant: In the EU, you are 35x more likely to die in a car crash than in a train crash. The EU has created the so-called Vision Zero program, which aims to reach zero road deaths by some arbitrarily chosen date in the future. And of course it talks about autonomously driving cars. You know, crazy idea, but what if, instead of betting it all on some hypothetical magic Jesus technology that may or may not exist by that arbitrarily chosen date, we focused on the real-world solution we already have? But then the car industry’s investors would make less money, so I can answer that myself. :(

    Edit: Also, Musk is a Nazi cunt who should die of cancer.

    • Tja@programming.dev · 1 point · 1 day ago

      Well, there is no train station at my house. Or an Aldi. Or my kids’ Kindergarten. And I live in Germany, where public transport is excellent by global standards (memes about Deutsche Bahn aside).

      Cars will be necessary for the foreseeable future. Let’s make them as safe as possible while also investing in public transport; the two are not mutually exclusive.

      PS: fuck Elon.

    • bleistift2@sopuli.xyz · 1 point · 2 days ago

      Speaking as a German: there are fewer train-related deaths because the trains don’t run.

      • SwingingTheLamp@midwest.social · 1 point · 2 days ago

        It’s kind of the natural result, because cars are good for:

        • navigating a landscape designed to exclude anything but cars
        • conspicuous consumption
        • identity signaling

        And they’re really bad for:

        • people
        • the environment
        • transportation

        Do an honest evaluation, and “don’t use cars” is the inevitable conclusion.