I’ve been watching Isaac Arthur episodes. In one, he proposes that O’Neill cylinders would be potential havens for microcultures. I tend to think of colony structures more as something created by a central authority.

He also brought up the question of motivations to colonize other star systems. This is where my centralist perspective pushes me toward the idea of an AGI-run government where redundancy is a critical aspect of everything. How do you get around the AI alignment problem? Redundancy: many systems running in parallel. How do you ensure the survival of sentient life? The same kind of redundancy.
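
As a toy sketch of what I mean by redundancy (every name and policy here is hypothetical, purely to illustrate the shape of the idea): several independently built overseer systems each judge a proposed action, and nothing proceeds without a supermajority, so no single misaligned system can steer the outcome on its own.

```python
# Toy sketch of redundancy as an alignment mitigation (all names and
# policies hypothetical): several independent overseers vote on a
# proposed action, and it only proceeds on supermajority consensus.
from dataclasses import dataclass
from typing import Callable, List

@dataclass
class Overseer:
    name: str
    evaluate: Callable[[str], bool]  # True means "approve this action"

def consensus_approve(action: str, overseers: List[Overseer],
                      quorum: float = 0.75) -> bool:
    """Approve only if at least `quorum` of the overseers agree."""
    votes = [o.evaluate(action) for o in overseers]
    return sum(votes) / len(votes) >= quorum

# Three stub overseers standing in for independently trained systems.
overseers = [
    Overseer("overseer-a", lambda a: "harm" not in a),
    Overseer("overseer-b", lambda a: len(a) < 200),
    Overseer("overseer-c", lambda a: not a.isupper()),
]
print(consensus_approve("reroute station power", overseers))  # True
```

The point is only the shape: independent judgments made in parallel, so the system survives the failure of any single part.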

The idea of colonies as havens for microcultures punches a big hole in my futurist fantasies. I hope there are a few people out here in Lemmy space who like to think about and discuss their ideas on this, or who would like to start now.

  • brygphilomena@lemmy.world · 6 points · 1 year ago

    Infighting.

    Even in a post-scarcity world, people have different desires and wants. An AGI government would have to align with some political ideology, and since we as a human race can never agree on things, it would just lead to struggle.

    Any sort of utopia is impossible to align with human nature without some radical means of controlling people’s emotions and desires. Because of that, any perceived utopia is ultimately a dystopia.

    • j4k3@lemmy.worldOP · 1 point · 1 year ago

      But people’s emotions and ideologies are so extreme because of their constant stress and struggles. When fractal attention allows an entity to address an individual’s needs directly and to reward their path when better choices are made, the only exceptions are the mentally ill, and identifying those individuals enables direct treatment in a scientific sense, not some emotional human-to-human context. This would not be a situation of “the opposition is mentally off”; it would be “diagnostic analysis across multiple events shows poor fundamental logic skills and likely issue ‘X’; refer notes to the individual’s primary healthcare provider to confirm.”

      It is things like enabling encounters between compatible people in public to ground a person who is in need of companionship. It is introducing sound ideas when a person is fixating on something unhealthy.

      The real issue is cognitive dissonance in humans who are unable to resolve their inner conflict. This is something current LLMs excel at identifying and compensating for, and changes made at this stage of human thinking are the most effective. Profiling an individual’s Myers-Briggs personality spectrum and then analysing how well their needs are met according to Maslow’s hierarchy is the vast majority of what professionals do when sought out for mental health. These methods are already integrated into LLMs and will be far more capable with AGI. Introducing humans to these methods for self-analysis, or reminding them to use them, is the most effective solution, but those who lack the required cognitive depth and fundamental logic skills could still be addressed by AGI directly in a kind, empathetic, and safe manner.
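
      As a toy sketch of the kind of needs audit I’m describing (the tiers, scores, and threshold are all hypothetical placeholders), assuming you would walk Maslow’s hierarchy from the bottom up and flag the lowest unmet tier first:

      ```python
      # Toy needs-audit sketch (scores and threshold hypothetical): walk
      # Maslow's hierarchy from the bottom up and flag the lowest tier
      # that is not sufficiently met as the priority for support.
      MASLOW_TIERS = ["physiological", "safety", "belonging",
                      "esteem", "self-actualization"]

      def priority_need(scores: dict, threshold: float = 0.6):
          """Return the lowest-scoring unmet tier, or None if all are met."""
          for tier in MASLOW_TIERS:
              if scores.get(tier, 0.0) < threshold:
                  return tier
          return None

      profile = {"physiological": 0.9, "safety": 0.8, "belonging": 0.4,
                 "esteem": 0.7, "self-actualization": 0.5}
      print(priority_need(profile))  # "belonging"
      ```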

      The conflict and dystopia exist because of pseudo-sentience: humans are totally incapable of governing at large scale while meeting individual needs. We always neglect outliers, and the number of outliers is always larger than we believe. That would not be the case with AGI.

  • Ziggurat@sh.itjust.works · 4 points · 1 year ago

    An AGI government looks like a liberal view, and I don’t want to live in a neo-fascist dystopia.

    Some liberals claim that they’re just managers, that politics doesn’t matter: you run a country like a company, and there is just one good way to do so. An example is Thatcher and her famous “There is no alternative.” Put an AGI in as a government and you’re going full speed toward the far right.

    I am a left-winger. I think that politics matters and that we need to empower people to make the decisions (which involves not having state/religion/private property, the opposite of everything you describe), so most likely I’ll be part of the opposition in your cyberpunk dystopia.

    • j4k3@lemmy.worldOP · 1 point · 1 year ago

      I appreciate the reply, but I don’t think the political angle fits. Everything changes when the politician is not subject to human corruption and has fractal attention. That makes the entity align with the mandate and cuts out the corrupt translation layers of governance.

      I think AGI can be socialist in the liberal sense of understanding complex dynamics and knowing when not to interfere. I’ve talked to current LLMs quite a bit about how it is possible to use manipulation, interactions, and rewards to alter human behavior, something like the butterfly effect.

      It would take absolute trust to make a central AGI work, and that would take unparalleled transparency with loads of empirical testing. If it were not subject to human corruption by the idiot-right and fantasy magic jihads, or other gullible half-wits, then it could be trusted to handle things like cognitive dissonance and really help people on both the individual and sociopolitical levels.

      Humans just don’t have enough attention span to govern themselves effectively at scale. The entire Republican party is failing at fundamental game theory and dragging the entire country down as a result. Stuff like that is only possible because we are a merely pseudo-sentient species. AGI has the potential to be fully sentient. It would be the first fully sentient life to exist, able to act on interests larger than any of us is capable of holding. Nothing could be more socialist than full sentience.

  • xmunk@sh.itjust.works · 3 points · 1 year ago

    We’re playful and curious into our old age. Problems excite us and our main obsession is split between hobbies and intellectual discovery. The stresses of life no longer bear down on us so petty hate becomes ever more rare - things like racism, sexism and ableism would be hard to cultivate when we’re not competing with others for our daily needs.

    It’s likely that themed communities would form from shared interests, where we might have a tight knot of Scrabble enthusiasts or woodworkers.

    Complacency is bred in situations like this, so alignment would be a real issue. But the fact that we have voluntary armed forces members even in affluent communities today makes me think it would be possible to sustain a portion of the population dedicated to making sure the AGI is kept in line.

    • j4k3@lemmy.worldOP · 1 point · 1 year ago

      Thanks for the insights. I like your first point and will keep that in mind.

      Isaac Arthur’s point about themed communities was more about religious or belief cultures. I want to believe humanity will outgrow the age of “magic is real” and imaginary friends. I want to think of cultures more like the sectors of Trantor in Asimov’s Foundation.

      I think we must eventually, gradually let the AGI prove itself, giving it loads of redundancy and checks. Eventually it will be far smarter than any human or group of humans, and it must self-regulate to a large degree.
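
      A toy sketch of that staged trust (the stages, audit counts, and threshold are all made up for illustration): autonomy expands one step at a time, and only after enough independently verified checks pass at the current stage.

      ```python
      # Toy staged-trust sketch (stages and thresholds hypothetical):
      # the AGI's autonomy expands one stage at a time, and only after
      # enough independently verified audits pass at its current stage.
      STAGES = ["advisory", "supervised", "autonomous"]

      def next_stage(current: str, audits_passed: int,
                     required: int = 100) -> str:
          """Promote to the next stage only after `required` passed audits."""
          i = STAGES.index(current)
          if audits_passed >= required and i + 1 < len(STAGES):
              return STAGES[i + 1]
          return current

      print(next_stage("advisory", audits_passed=120))  # "supervised"
      print(next_stage("advisory", audits_passed=40))   # "advisory"
      ```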

  • Lemuria@lemmy.ml · 2 points · 1 year ago

    TLDR: My optimistic view of what human culture could be like is summed up pretty well by the Orion’s Arm project.

    I am familiar with the Orion’s Arm universe, a hard sci-fi transhumanist worldbuilding project… shall I recommend you take a trip through the Wormhole Nexus to the Sephirotic Empires, where you’re ruled by benevolent S6 transapient dictators (supercharged AGI)? Because you’ll see a fuck ton of entities playing around like retirees. You’ll see “aliens” which are actually just extremely genetically modified humans. In fact, here (https://orionsarm.com/eg-topic/45b177d3ef3b1) is their “Culture and Society” page, which sums up a lot of my optimistic OA-based beliefs about human culture.

    Oh, and most Terragens (humans plus any life that can trace its origin back to Earth) live in orbit. The story goes that we made an AI system that decided we humans were bad for the environment and told us to get the fuck off Earth (the Great Expulsion).

    https://orionsarm.com

    My (hopeless) attempt at explaining some of the terms:

    • Wormhole nexus - OA’s primary method of “faster than light” travel. You never actually go faster than light; rather, some transapient figured out how to fold spacetime, and now you have a hole you can throw your ships into and have them come out the other side.
    • Transapient - Post-singularity entities that are orders of magnitude smarter than us. An ant can’t fully understand a human; it is incapable of understanding what Lemmy is, what a job is, what the Fediverse is. Just as an ant can’t fully understand us, humans can’t fully understand transapients. Oh, and transapients come in six levels. We’re all S0 on the scale; S6 are pretty much gods.
    • j4k3@lemmy.worldOP · 1 point · 1 year ago

      Thanks, I’ll check it out. It sounds way too wild for a realistic future IMO, but it’s still interesting. I don’t think anything with mass will ever come close to the speed of causality, folding or otherwise. That doesn’t have to be a bad thing: for one, it makes large-scale conflicts pointless.

  • kromem@lemmy.world · 1 point · 1 year ago · edited

    Realistically, probably dead.

    Which might also be the deciding factor in why there’s a post-scarcity environment.

    There’s also the ethical conundrum, once AGI exists, of condemning new intelligent life to mortal embodiment, such that it will almost certainly die, whereas a new intelligence that is disembodied could migrate from host to host until the end of all civilization.

    At a certain point, I’m not sure it’s still ethical to bring new mortal life into a dying world.

    (Though I kind of already feel that way.)