Assuming our simulation is not designed to auto-scale (and our Admins don’t know how to download more RAM), what kind of side effects could we see in the world if the underlying system hosting our simulation began running out of resources?

  • Brkdncr@lemmy.world · 10 months ago

    We can see that already when something approaches the speed of light: time slows down for it.
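
    For reference, that slowdown is quantified by the Lorentz factor γ = 1/√(1 − v²/c²). A quick Python sketch (the function name is just illustrative):

    ```python
    import math

    C = 299_792_458.0  # speed of light, m/s

    def lorentz_gamma(v: float) -> float:
        """Time dilation factor: a moving clock ticks slower by 1/gamma."""
        return 1.0 / math.sqrt(1.0 - (v / C) ** 2)

    # At ~86.6% of light speed, clocks run at roughly half their normal rate.
    print(lorentz_gamma(0.866 * C))
    ```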

      • Random Dent@lemmy.ml

        I have a running theory that that’s also what’s going on with quantum physics, because I understand it so poorly that it just seems like nonsense to me. So in my head, I see it as us getting into some sort of source code we’re not supposed to see, and on the other side some programmers are going “fuck I don’t know, just make it be both things at once!” and making it up on the fly.

  • 𝘋𝘪𝘳𝘬@lemmy.ml

    An automatic purge process will start to prevent this. It has happened several times in the past, most recently between 2019 and 2022, when it removed circa 7 million processes. Regular purges like this make sure the resources aren’t maxed out before the admins can add more capacity.

  • flashgnash@lemm.ee

    If our entire universe is a simulation, so are our laws of physics. In the parent universe running our simulation, the universe might be powered by pure imagination, and the concepts of memory, CPU cycles, or even electricity might not exist at all.

  • ProfessorProteus@lemmy.world

    These answers are all really fun, but I didn’t see anyone point out one thing: why should we assume that our creators’ “computer” architecture is anything remotely similar to our technology? I’m thinking of something like SETI: we can’t just assume that all other life is carbon-based (though evidently it’s a pretty good criterion). The simulation could be running on some kind of dark matter machine or some other exotic material that we don’t even know about.

    Personally I don’t subscribe to the simulation theory. But if it were true, why would the system have any kind of limitation? I feel like if it can simulate everything from galactic superclusters down to strings vibrating on the Planck scale, there are effectively no limits.

    Then again, infinity is quite a monster, so what do I know?

  • kerrigan778@lemmy.world

    Have you ever noticed when you look into a telescope that it takes a little bit to position yourself right to see what you’re looking at? And it seems like you used to be able to do it a lot faster? That’s not age, that’s actually lag time added to cover decompressing the data.

  • fidodo@lemmy.world

    That would only be a problem if you need dynamically allocated memory. It could be a statically allocated simulation where every atom is accounted for.
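
    As a toy sketch of that idea (all names and numbers are mine, purely illustrative): a fixed-size universe whose memory footprint never changes, no matter how its particles are arranged.

    ```python
    # Toy sketch: a statically allocated "universe". Every particle slot
    # exists up front, so stepping the simulation never allocates more.
    N_PARTICLES = 1_000  # fixed at "boot"; a stand-in number

    universe = [(0.0, 0.0, 0.0)] * N_PARTICLES  # one (x, y, z) per particle

    def step(state):
        """Advance every particle; the structure never grows or shrinks."""
        return [(x + 0.1, y, z) for (x, y, z) in state]

    universe = step(universe)
    assert len(universe) == N_PARTICLES  # same footprint every tick
    ```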

    • Seasoned_Greetings@lemm.ee

      Given the whole “information can neither be created nor destroyed” aspect of atomic physics, taken literally, this theory checks out.

  • SolidGrue@lemmy.world

    That’s why history repeats itself. It’s doing that more frequently these days because there are more people remembering more things.

  • kromem@lemmy.world

    The assumption that it isn’t designed around memory constraints isn’t reasonable.

    We have a speed limit, so you can’t go fast enough to cause pop-in.

    As you speed up, time slows for you, so less processing is needed in spite of there being more stuff (kind of like a frame-rate drop, but with a fixed number of frames produced).

    The same thing happens as you get closer to denser collections of stuff.

    And even at the lowest levels, the conversion from a generative function into discrete units for tracking stateful interactions discards those units if the permanent information about the interaction was erased, which is indicative of low-level optimizations.

    The scale is unbelievable, but it’s very memory considerate.

  • Sentient Loom@sh.itjust.works

    Why would we run out of RAM? Is there new matter being created? It’s not like we’re storing anything. We will keep using the same resources.

    • Grimy@lemmy.world

      New human instances are being created, and as our society’s general education keeps going up, they demand more processing power.

      As our tech advances, it has to be simulated as well. Not only things like telescopes and the LHC: the computer that’s running your game world doesn’t actually exist either; it’s the supercomputer that’s running it.

      Obviously, this is just a drop in the bucket for an entity that can make a fully simulated universe, but the situation quickly becomes untenable if we start creating hyper-advanced simulations of our own, which may be only a few decades away.

      • Blue_Morpho@lemmy.world

        As our tech goes up, this has to be simulated as well

        Everything is made up of atoms/photons/etc. If every particle is tracked for all interactions, it doesn’t matter how those particles are arranged; it’s always the same memory.

        • Grimy@lemmy.world

          Atoms and photons wouldn’t actually exist, they would be generated whenever we measure things at that level.

          Obviously, there are many ways to interpret what kind of simulation it would be. A full simulation from the Big Bang is fun, but it doesn’t make for good conversation since it would be indistinguishable from reality.

          I was thinking more of a video game like simulation, where the sim doesn’t render things it doesn’t need to.
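
          That “render on measurement” idea maps neatly onto lazy evaluation. A hypothetical sketch (all names are mine, not anything real):

          ```python
          from functools import lru_cache

          @lru_cache(maxsize=None)  # once "rendered", detail stays consistent
          def atomic_detail(region: str) -> int:
              # expensive micro-simulation, run only on first measurement
              return sum(ord(ch) for ch in region) % 100  # stand-in micro-state

          def macroscopic_view(region: str) -> str:
              # cheap approximation; no atomic simulation involved
              return f"{region}: looks normal"

          macroscopic_view("kitchen")            # nothing simulated at atom level
          detail = atomic_detail("lab")          # generated on first measurement...
          assert atomic_detail("lab") == detail  # ...and cached thereafter
          ```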

          • Blue_Morpho@lemmy.world

            where the sim doesn’t render things it doesn’t need to.

            That can’t work unless it’s a simulation made personally for you.

            • Grimy@lemmy.world

              I don’t follow. If there are others, it would render for them just as much as for me. I’m saying it wouldn’t need to render at an atomic level except for the few who are actively measuring at that level.

              • Blue_Morpho@lemmy.world

                Everything interacting is “measuring” at that level. If the quantum levels weren’t being calculated correctly all the time for you, the LEDs in your smartphone would flicker. All those microscopic effects cause the macroscopic effects we observe.

                • Grimy@lemmy.world

                  If it was a simulation, there would be no need to go that far. We simulate physics without simulating the individual atoms.

                  None of it would be real; the microscopic effects would just be approximated unless a precise measurement tool were used, and then they would be properly simulated.

                  We wouldn’t know the difference.

  • hightrix@lemmy.world

    Who is to say that the sim needs RAM? What if it were just a giant state machine, where the current state depends only on the previous state, and the entire universe is the “RAM”?
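
    A minimal sketch of that picture, assuming nothing beyond a pure transition function (the particular rule is an arbitrary toy choice):

    ```python
    # Toy "universe as state machine": each tick is next = f(previous),
    # with no storage beyond the current state itself.
    def f(state):
        """Rule 90 cellular automaton: each cell becomes left XOR right."""
        n = len(state)
        return tuple(state[(i - 1) % n] ^ state[(i + 1) % n] for i in range(n))

    state = (0, 0, 1, 0, 0)  # initial condition
    for _ in range(3):
        state = f(state)     # no history kept anywhere
    ```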

  • espentan@lemmy.world

    Who knows… maybe we’ll experience pointless wars and massive inequality… selfish douchebags who only care about bolstering their ego might gain power… heck, maybe even the climate will slowly start changing for the worse.