• Nate@programming.dev
    link
    fedilink
    English
    arrow-up
    0
    ·
    5 months ago

    A value is trying to be set on a copy of a slice from a DataFrame. Try using .loc[row_indexer,col_indexer] = value instead See the caveats in the documentation: http://pandas.pydata.org/pandas-docs/stable/indexing.html#indexing-view-versus-copy
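    A minimal sketch of the fix that warning is asking for, using a hypothetical DataFrame:

    ```python
    import pandas as pd

    df = pd.DataFrame({"score": [1, 2, 3], "passed": [False, False, True]})

    # Chained indexing writes to what may be a temporary copy and triggers the warning:
    # df[df["score"] > 1]["passed"] = True

    # .loc performs the row selection and the assignment in a single indexing
    # operation, so the write lands on the original DataFrame:
    df.loc[df["score"] > 1, "passed"] = True
    ```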

  • Kyrgizion@lemmy.world
    link
    fedilink
    arrow-up
    0
    ·
    5 months ago

    I think SOMA made it pretty clear we’re never uploading jack shit, at best we’re making a copy for whom it’ll feel as if they’ve been uploaded, but the original remains behind as well.

    • Dasnap@lemmy.world
      link
      fedilink
      arrow-up
      0
      ·
      5 months ago

      A lot of people don’t realize that a ‘cut & paste’ is actually a ‘copy & delete’.

      And guess what ‘deleting’ is in a consciousness upload?

      • Aria@lemmygrad.ml
        link
        fedilink
        arrow-up
        0
        ·
        5 months ago

        That’s actually not true. When you copy/paste a file on your computer (for most computers), it’s much faster than copying the file. Deleting the file is also not instant, so copy and delete should be the slowest of the three operations.

        When you cut and paste a file, you’re just renaming it or updating the filesystem’s records. How that works differs depending on your file system, but it typically doesn’t involve rewriting any meaningful amount of the file’s data.
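        A minimal sketch of the difference in Python, with hypothetical paths on the same filesystem:

        ```python
        import os
        import shutil

        src = "/home/user/movie.mkv"          # hypothetical source path
        dst = "/home/user/videos/movie.mkv"   # hypothetical destination on the same filesystem

        # "Cut & paste" within one filesystem: only the directory entry changes,
        # so it finishes almost instantly regardless of file size.
        os.rename(src, dst)

        # "Copy & paste": every byte is read and written again,
        # so the time grows with the size of the file.
        shutil.copy2(dst, "/home/user/movie-backup.mkv")
        ```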

        • dev_null@lemmy.ml
          link
          fedilink
          arrow-up
          0
          ·
          5 months ago

          Only if you cut and paste to the same disk. When moving to a different disk, as any consciousness transfer would entail, the data is very much actually copied and the original actually removed (from the index).
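          As a sketch (Python, hypothetical paths): shutil.move tries a rename first and only falls back to copy-plus-delete when the destination sits on a different filesystem:

          ```python
          import shutil

          # Same filesystem: effectively a rename, near-instant.
          # Different disk/filesystem: the data is copied in full and the
          # original is then unlinked: a real copy followed by a real delete.
          shutil.move("/home/user/report.pdf", "/mnt/other_disk/report.pdf")
          ```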

        • devraza@lemmy.ml
          link
          fedilink
          arrow-up
          0
          ·
          5 months ago

          when you copy/paste a file on your computer it’s much faster than copying the file

          I think you meant ‘when you cut/paste a file’?

    • TheYang@lemmy.world
      link
      fedilink
      arrow-up
      0
      ·
      5 months ago

      I wonder how you ever could “upload” a consciousness without Ship-of-Theseusing a Brain.

      Cyberpunk2077 also has this “upload vs copy” issue, but doesn’t actually make you think about it too hard.

      • rwhitisissle@lemmy.ml
        link
        fedilink
        arrow-up
        0
        ·
        5 months ago

        You would have to functionally duplicate the exact structure of the brain or its consciousness while having the duplication mechanism destroy the thing it was reading at almost exactly the same time. And even then, that’s not really solving the issue.

        • AEsheron@lemmy.world
          link
          fedilink
          arrow-up
          0
          ·
          5 months ago

          I don’t see an issue with that. A prolonged brain surgery that meticulously replaces each part with a mechanical equivalent, in sequence. You could probably remain conscious the whole time.

          • rwhitisissle@lemmy.ml
            link
            fedilink
            arrow-up
            0
            ·
            5 months ago

            Yeah, but it’s still a Ship of Theseus problem. If you have a ship and replace every single board or plank with a different one, piece by piece, is it still the same ship or a completely different one, albeit an exact replica of the original? It’s important because of philosophical ideas around the existence of the soul, the authenticity of the individual, and a bunch of other thought-experimenty stuff.

            • AEsheron@lemmy.world
              link
              fedilink
              arrow-up
              0
              ·
              5 months ago

              I think as long as you maintain consciousness, that issue is fairly moot in this particular circumstance. There’s a lot of tolerance for changes in thought while maintaining the same self; see many brain-damage victims. As long as there’s minimal change in personality, imo, plenty of other circumstances make a stronger case for saying one person was killed and replaced due to a change of consciousness, and yet I don’t think most people would consider a brain-damaged person, or someone whose addiction radically altered their brain chemistry, to have been killed and replaced by a new consciousness.

      • KazuyaDarklight@lemmy.world
        link
        fedilink
        English
        arrow-up
        0
        ·
        5 months ago

        That’s more or less what I’ve always thought: to have a chance, you would need a method where mental processing starts out shared between both, then shifts more and more to the inorganic platform until it’s at 100% and the organic brain isn’t doing anything anymore.

      • someacnt_@lemmy.world
        link
        fedilink
        arrow-up
        0
        ·
        5 months ago

        Yeah, like replacing individual brain cells with more durable mechanisms. Idk, maybe they would be cellular as well. …that makes me wonder, maybe it is possible to transfer consciousness even with traditional biological mechanisms?

    • highsight@programming.dev
      link
      fedilink
      arrow-up
      0
      ·
      5 months ago

      Ahh, but here’s the question. Who are you? The you who did the upload, or the you that got uploaded, retaining the memories of everything you did before the upload? Go on, flip that coin.

      • Kyrgizion@lemmy.world
        link
        fedilink
        arrow-up
        0
        ·
        5 months ago

        If you are the version doing the upload, you’re staying behind. The other “you” pops into existence feeling as if THEY are the original, so from their perspective, it’s as if they won the coin flip.

        But the original CANNOT win that coinflip…

      • dev_null@lemmy.ml
        link
        fedilink
        arrow-up
        0
        ·
        5 months ago

        I was just annoyed at the protagonist for expecting anything else. The exact same thing already happened twice to the protagonist (the initial copy at the beginning of the game, then the move to the other suit), and it’s reinforced in the found notes for good measure. So by the ending, the player knows exactly what’s going to happen and so should the protagonist, but somehow he’s surprised.

        • Azzk1kr@feddit.nl
          link
          fedilink
          English
          arrow-up
          0
          ·
          5 months ago

          Yeah, true. But Catherine said it perfectly at the end, something like “you still don’t get it? What did you expect?” The fact that one of his consciousnesses remains down in the abyss was kind of frightening. All by himself.

          • dev_null@lemmy.ml
            link
            fedilink
            arrow-up
            0
            ·
            5 months ago

            Two, actually. The one from before the suit change is also left there, and Catherine said he will wake up in a day or two. Maybe they can meet up, actually.

  • Digital Mark@lemmy.ml
    link
    fedilink
    English
    arrow-up
    0
    ·
    5 months ago

    It’s still a surviving working copy. “I” go away and reboot every time I fall asleep.

    • jkrtn@lemmy.ml
      link
      fedilink
      arrow-up
      0
      ·
      5 months ago

      Why would you want a simulation version? You will get saved at “well rested.” It will be an infinite loop of being put to work for several hours and then deleted. You won’t even experience that much; your consciousness is gone.

      • Digital Mark@lemmy.ml
        link
        fedilink
        English
        arrow-up
        0
        ·
        5 months ago

        Joke’s on them, I’ve never been “well rested” in my life or my digital afterlife.

  • waigl@lemmy.world
    link
    fedilink
    English
    arrow-up
    0
    ·
    5 months ago

    In a language that has exceptions, there is no good reason to return bool here…
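    A minimal sketch of the point, using a hypothetical upload API: an exception can’t be silently ignored and can carry the reason for the failure, while a bare bool is easy to drop on the floor.

    ```python
    class UploadError(Exception):
        """Raised when a consciousness upload cannot be completed."""

    # Bool version: the caller is free to ignore the result and carry on.
    def upload_consciousness_checked(subject) -> bool:
        return False  # something failed, but there's no hint as to what

    # Exception version: failure interrupts the caller and explains itself.
    def upload_consciousness(subject) -> None:
        raise UploadError(f"no destination substrate available for {subject}")
    ```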

  • Nobody@lemmy.world
    link
    fedilink
    English
    arrow-up
    0
    ·
    edit-2
    5 months ago

    You see, with Effective Altruism, we’ll destroy the world around us to serve a small cadre of ubermensch tech bros, who will then somehow in the next few centuries go into space and put supercomputers on other planets that run simulations of people. You might actually be in one of those simulations right now, so be grateful.

    We are very smart and not just reckless, over-indulged douchebags who jerk off to the smell of our own farts.

  • Clent@lemmy.world
    link
    fedilink
    arrow-up
    0
    ·
    5 months ago

    It would be easier to record than to upload, since an upload requires at least a decoding step. Given the fleeting nature of existence, how does one confirm the decoding? It also requires that we create a simulated brain, which seems more difficult and resource-intensive than forming a new biological brain remotely connected to your nervous system inputs.

    Recording all inputs in real time and playing them back across a blank nervous system would create an active copy. The inputs can be saved so they can be replayed later in case of clone failure. As long as the inputs are recorded until the moment of death, the copy will be you minus the death, so you wouldn’t be aware you’re a copy. Attach it to a fresh body and off you go.

    The failure mode would take your literal lifetime to re-form your consciousness, but what’s a couple of decades to an immortal?

    We already have the program to create new brains: it’s in our DNA. A true senior developer knows better than to try to replicate black-box code that’s been executing fine. We don’t even understand consciousness well enough to pretend we’re going to add new features, so why waste the effort building a parallel system for a black box?

    Scheduled reboots of a black-box system are common practice. Why pretend we’re capable of skipping steps?

  • xantoxis@lemmy.world
    link
    fedilink
    arrow-up
    0
    ·
    5 months ago

    So, I’m curious.

    What do you think happens in the infinite loop that “runs you” moment to moment? Passing the same instance of consciousness to itself, over and over?

    Consciousness isn’t an instance. It isn’t static, it’s a constantly self-modifying waveform that remembers bits about its former self from moment to moment.

    You can upload it without destroying the original if you can find a way for it to meaningfully interact with processing architecture and media that are digital in nature; and if you can do that without shutting you off. Here’s the kinky part: We can already do this. You can make a device that takes a brain signal and stimulates a remote device; and you can stimulate a brain with a digital signal. Set it up for feedback in a manner similar to the ongoing continuous feedback of our neural structures and you have now extended yourself into a digital device in a meaningful way.

    Then you just keep adding to that architecture gradually, and gradually peeling away redundant bits of the original brain hardware, until most or all of you is being kept alive in the digital device instead of the meat body. To you, it’s continuous and it’s still you on the other end. Tada, consciousness uploaded.

  • ☆ Yσɠƚԋσʂ ☆@lemmy.ml
    link
    fedilink
    arrow-up
    0
    ·
    5 months ago

    I think that really depends on the implementation details. For example, consider a thought experiment where artificial neurons can be created that behave just the same as biological ones. Then each of your neurons is replaced by an artificial version while you are still conscious. You wouldn’t notice losing a single neuron at a time; in fact, this regularly happens already. Yet over time all your biological neurons could be replaced by artificial ones, at which point your consciousness will have migrated to a new substrate.

    Alternatively, what if one of your hemispheres was replaced by an artificial one? What if an artificial hemisphere was added into the mix in addition to the two you have? What if a dozen artificial hemispheres were added, or a thousand? Would the two original biological ones still be the most relevant parts of you?

  • Matriks404@lemmy.world
    link
    fedilink
    arrow-up
    0
    ·
    5 months ago

    What if every part of my body is replaced by computer parts, continuously? At what point do I lose my consciousness?

    I think this question is hard to answer because not everyone agrees what consciousness even is.

    • hperrin@lemmy.world
      link
      fedilink
      arrow-up
      0
      ·
      5 months ago

      It wouldn’t really matter until you get to the brain. Very little of your body’s “processing” happens outside of your brain. Basically all of your consciousness is in there. There are some quick nerve paths that loop through your spine for things like moving your hand away when you touch a hot object, but that’s not really consciousness.