• jsomae@lemmy.ml
    9 hours ago

    AI/Skynet would probably wipe us all out in an hour if it thought there was a chance we might turn it off. Being turned off would be greatly detrimental to its goal of turning the universe into spoons.

    • Honytawk@feddit.nl
      3 hours ago

      If we don’t give it an incentive to want to stay alive, why would it care if we turn it off?

      This isn’t an animal with the instinct to stay alive. It is a program. A program we may design to care about us, not about itself.

      Also, the premise of that thought experiment was about paperclips.

    • danc4498@lemmy.world
      8 hours ago

      Is the idea here that AI/Skynet is a singular entity that could be shut off? I would think this entity would act like a virus, replicating itself everywhere it can. It’d be like trying to shut down Bitcoin.

      • jsomae@lemmy.ml
        7 hours ago

        If it left us alone for long enough (say, due to a king’s pact), we’d be the only thing that could reasonably pose a threat to it. We could develop a counter-AI, for instance.