• Derproid@lemm.ee · 1 year ago

    The interesting part, to my mind, is that you would die and be gone; there would just exist another entity that can perfectly replicate you. Take, for example, the case of there being two of you: which one is the real one? The original? What if I kill the original? Does the new one become the real you? But what if I don't kill you and instead let the duplicate take over your life? Are you the real you, trapped in some cell, or is the duplicate the real you, living your life?

    My point, really, is that it's all a matter of perspective. To everyone else the clone would be the real you, but from your own perspective you are the real you and the clone stole your life.

    • ℕ𝕖𝕞𝕠@slrpnk.net · 1 year ago

      I’m not my body and I’m not my mind. I am the ethical soul, the decision-making process. If the replacement makes all the same decisions I would, it IS me.

      • queermunist she/her@lemmy.ml · 1 year ago

        What if something like ChatGPT were trained on a dataset of your life and used that to make the same decisions you would? It doesn't have a mind, memories, emotions, or even a phenomenal experience of the world. It's just a large language dataset of your life, with algorithms to sort out decisions; it's not even a person.

        Is that you?

          • queermunist she/her@lemmy.ml · 1 year ago

            I’m having a hard time imagining a decision that can’t be language-based.

            You come to a fork in the road and choose to go right. Obviously no language was involved in that decision, but the decision can certainly be expressed with language, and so a large language model can make it.

              • queermunist she/her@lemmy.ml · 1 year ago (edited)

                It doesn’t matter how it comes to make a decision as long as the outcome is the same.

                Sorry, this is beside the point. Forget ChatGPT.

                What I meant was a set of algorithms that produces the same outputs as your own choices, even though it involves no thoughts, feelings, or experiences. Not a true intelligence, just an NPC that acts exactly as you act. Imagine this thing exists. Are you saying it is indistinguishable from you?
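The "NPC" being described can be sketched, very loosely, as a pure lookup policy: a table of recorded choices with no inner experience behind it. This is only an illustration of the thought experiment; every name and situation below is invented.

```python
# Hypothetical sketch: a "behavioral clone" as a bare lookup table mapping
# situations to the decisions the original person is known to have made.
# There is no mind here, just recorded outputs being replayed.

decisions_log = {
    "fork in the road": "go right",
    "coffee or tea": "coffee",
}

def npc_decide(situation: str) -> str:
    # For situations never recorded, fall back to a default behavior.
    return decisions_log.get(situation, "hesitate")

print(npc_decide("fork in the road"))  # go right
```

The philosophical question in the thread is whether a policy like this, scaled up until its outputs match yours in every situation, would count as you, given that nothing in it thinks or feels.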