Assuming our simulation is not designed to auto-scale (and our Admins don’t know how to download more RAM), what kind of side effects could we see in the world if the underlying system hosting our simulation began running out of resources?

  • Pons_Aelius@kbin.social · 10 months ago

    Simply put: we wouldn’t notice anything.

    Our perception of the world would be based only on the compute cycles, not on any external time-frame.

    The machine could run at a million billion hertz or at one clock cycle per century, and our perception of time inside the machine would be the same.

    The same goes for low RAM: we would have no indication if we were constantly being paged out to a hard drive and written back to RAM as required.

    Greg Egan gives a great explanation of this in the opening chapter of his novel Permutation City.
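
    In code terms, a minimal sketch of that idea (plain Python, all names invented): the host can stall for any real-world interval between update steps, and nothing observable inside the loop changes.

    ```python
    import random
    import time

    def run_universe(ticks):
        """Toy simulation whose inhabitants can only ever see the tick counter."""
        subjective_time = 0
        for _ in range(ticks):
            # The host stalls for an arbitrary real-world delay between steps
            # (slow hardware, paging, the admin suspending the VM...).
            time.sleep(random.uniform(0.0, 0.5))
            subjective_time += 1  # the only clock that exists inside the simulation
        return subjective_time

    print(run_universe(10))  # always 10 "seconds" of subjective time, whatever the host speed
    ```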

    • Feyr@lemmy.world · 10 months ago

      Clearly wrong.

      Running out of RAM happens all the time. We see something and store it, and that something also gets stored in RAM. But if that second copy gets reaped by the OOM killer, the universe reprocesses it.

      Since it’s already in our copy, it causes weird issues. We call it déjà vu!
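
      Tongue in cheek, but the mechanism being joked about is ordinary cache eviction followed by recomputation. A toy sketch (hypothetical, with Python’s lru_cache standing in for the universe’s RAM):

      ```python
      from functools import lru_cache

      @lru_cache(maxsize=2)              # a tiny cache, so entries get "reaped" quickly
      def perceive(event):
          print(f"processing {event}")   # this side effect repeats once the entry is evicted
          return hash(event)

      perceive("sunset")
      perceive("rain")
      perceive("traffic")   # evicts "sunset", the least recently used entry
      perceive("sunset")    # reprocessed from scratch: prints again -- déjà vu
      ```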

  • Brkdncr@lemmy.world · 10 months ago

    We can see that already when something approaches the speed of light: time slows down for it.
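
    (For reference, that is just standard special relativity, nothing simulation-specific: a clock moving at speed v ticks slower by the Lorentz factor.)

    ```latex
    % Time dilation: an interval \Delta t_{proper} on the moving clock is seen
    % as a longer interval \Delta t_{observed} by a stationary observer.
    \Delta t_{observed} = \frac{\Delta t_{proper}}{\sqrt{1 - v^2 / c^2}}
    ```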

      • Random Dent@lemmy.ml · 10 months ago

        I have a running theory that that’s also what’s going on with quantum physics, because I understand it so poorly that it just seems like nonsense to me. So in my head, I see it as us getting into some sort of source code we’re not supposed to see, and on the other side some programmers are going “fuck I don’t know, just make it be both things at once!” and making it up on the fly.

  • 𝘋𝘪𝘳𝘬@lemmy.ml · 10 months ago

    An automatic purge process will start to prevent this. It has happened several times in the past, most recently between 2019 and 2022, when it removed circa 7 million processes. Regular purges like this make sure resources aren’t maxed out before the admins can add more capacity.

  • degen@midwest.social · 10 months ago

    Data in memory will be offloaded to swap space. I doubt we’d notice any fluctuations since we’re part of the simulation, but externally it could slow to a crawl and basically be useless. They might shut it down, hopefully just to refactor. But again we probably wouldn’t notice any downtime, even if it’s permanent.
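
    A rough analogy in code (a sketch only, not how any real hypervisor does it): state pushed out to a file on disk and mapped back in looks exactly the same to the code that reads it.

    ```python
    import mmap
    import tempfile

    # "Universe state" gets written out to disk, then accessed through a memory map.
    # The reading code cannot tell whether the bytes live in RAM or were paged in.
    state = b"atom" * 1_000_000

    with tempfile.TemporaryFile() as f:
        f.write(state)
        f.flush()
        with mmap.mmap(f.fileno(), 0) as offloaded:
            assert offloaded[:4] == b"atom"      # indistinguishable from reading RAM
            assert len(offloaded) == len(state)
    ```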

      • andyburke@fedia.io · 10 months ago

        Not sure you’ve experienced the end of many SimCity games if you think this is the case. 😂

        If anything, the earth lately kinda feels like someone’s gotten bored with the game.

    • Bitrot@lemmy.sdf.org · 10 months ago

      IIRC this is a plot point in the book “Fall; or, Dodge in Hell” by Neal Stephenson (sequel to Reamde). At some point the virtual world slows to a crawl so much that people outside of it cannot really track what is going on, but it’s transparent to those inside the world. I might be misremembering exactly how it was implemented.

  • darkpanda@lemmy.ca · 10 months ago

    Maybe we’re already there and death is just the garbage collector freeing up more space.

  • flashgnash@lemm.ee · 10 months ago

    If our entire universe is a simulation, so are our laws of physics. In the parent universe running our simulation, our universe might be powered by pure imagination, and the concepts of memory, CPU cycles, or even electricity might not exist at all.

  • ProfessorProteus@lemmy.world · 10 months ago

    These answers are all really fun, but I didn’t see anyone point out one thing: why should we assume that our creators’ “computer” architecture is anything remotely similar to our technology? I’m thinking of something like SETI: we can’t just assume that all other life is carbon-based (though evidently it’s a pretty good criterion). The simulation could be running on some kind of dark-matter machine or some other exotic material that we don’t even know about.

    Personally I don’t subscribe to the simulation theory. But if it were true, why would the system have any kind of limitation? I feel like if it can simulate everything from galactic superclusters down to strings vibrating on the Planck timescale, there are effectively no limits.

    Then again, infinity is quite a monster, so what do I know?

    • lolcatnip@reddthat.com · 10 months ago

      all other life is carbon-based (though evidently it’s a pretty good criterion)

      The short version is that the only other element that readily forms four covalent bonds is silicon, but nobody has been able to find a solvent that allows complex silicon-based molecules to form without instantly dissolving any structures they create.

      • ProfessorProteus@lemmy.world · 10 months ago

        I remember reading about how silicon is theoretically possible, but I had (erroneously) assumed there were more potential candidates. Thanks for the additional info. This stuff is so fascinating!

  • kerrigan778@lemmy.world · 10 months ago

    Have you ever noticed when you look into a telescope that it takes a little bit to position yourself right to see what you’re looking at? And it seems like you used to be able to do it a lot faster? That’s not age, that’s actually lag time added to cover decompressing the data.

    • AnomalousBit@programming.dev · 10 months ago

      I believe you are thinking in terms of a Turing-machine-like computer. I don’t think it’s possible today to “suspend” the bits in a quantum computer. I also don’t think it’s possible to know if the simulation could be paused (or even “added to” without losing its initial state).

  • fidodo@lemmy.world · 10 months ago

    That would only be a problem if you need dynamically allocated memory. It could be a statically allocated simulation where every atom is accounted for.
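
    A quick sketch of the difference (illustrative only; N_ATOMS and the rest are made up): a buffer sized for every atom up front never needs another allocation at runtime, unlike a structure that grows as events pile up.

    ```python
    N_ATOMS = 10_000  # hypothetical: every atom in the universe, counted up front

    # "Statically allocated" style: fixed buffers, sized once, updated in place.
    positions = [0.0] * (N_ATOMS * 3)
    velocities = [0.0] * (N_ATOMS * 3)

    def step(dt):
        for i in range(len(positions)):
            positions[i] += velocities[i] * dt   # no new storage needed, ever

    # Versus a dynamic style that can eventually exhaust memory:
    events = []
    def record(event):
        events.append(event)                     # grows without bound as history piles up

    step(0.01)
    ```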

    • Seasoned_Greetings@lemm.ee · 10 months ago

      Given the whole “information can neither be created nor destroyed” principle from quantum physics, taken literally, this theory checks out.

  • SolidGrue@lemmy.world · 10 months ago

    That’s why history repeats itself. It’s doing that more frequently these days because there are more people remembering more things.

  • HeartyBeast@kbin.social · 10 months ago

    Render distance would be reduced, requiring us to come up with plausible theories to account for the fact that there is a limit to the size of the so-called ‘observable universe’.
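
    Taken literally, that mechanic is just distance culling, something like this toy sketch (all names invented; the radius is only a rough figure for the observable universe):

    ```python
    import math

    OBSERVABLE_RADIUS = 4.4e26  # metres, roughly the radius of the observable universe

    def rendered(objects, observer):
        """Only simulate detail for objects inside the horizon around the observer."""
        return [obj for obj in objects
                if math.dist(obj, observer) <= OBSERVABLE_RADIUS]

    # Anything beyond the cutoff simply never shows up for this observer.
    print(rendered([(0.0, 0.0, 1.0e26), (0.0, 0.0, 9.0e26)], (0.0, 0.0, 0.0)))
    ```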