with the way AI is getting better by the week, it just might be a reality

  • tacosanonymous@lemm.ee · 1 year ago

    I think I’d stick to not judging them, but if it were in place of actual socialization, I’d want to get them help.

    I don’t see it as a reality. We don’t have AI. We have language learning programs that are hovering around mediocre.

      • jeffw@lemmy.world · 1 year ago

        If you’re that crippled by social anxiety, you need help, not isolation with a robot.

      • givesomefucks@lemmy.world · 1 year ago

        Then get professional help if you can’t improve on your own.

        Social skills aren’t innate and some people take longer than others to get them.

        Getting help is a lot less embarrassing than living your whole life without social skills. Maybe that’s a shrink, maybe that’s a day program for people with autism, maybe it’s just hanging out with other introverts. But it’ll only get better if you put the effort in. If you don’t, don’t be surprised when nothing changes.

    • cheese_greater@lemmy.world · 1 year ago

      I don’t see it as any more problematic than falling into a YouTube/Wikipedia/Reddit rabbit hole. As long as you don’t really believe it’s capital-S Sentient, I don’t see an issue. I would prefer that people with social difficulties practice on ChatGPT, pay attention to the dialectical back and forth, and take lessons from that into the real world and their interactions with it.

    • novibe@lemmy.ml · edited · 1 year ago

      That is really unscientific. There is a lot of research on LLMs showing they have emergent intelligent features. They have internal models of the world etc.

      And there is nothing to indicate that what we do is not “transforming” in some way. Our minds might be indistinguishable from what we are building towards with AI currently.

      And that will likely make more of us start realising that the brain and the mind are not consciousness. We’ll build intelligences, with minds, but without consciousnesses.

      • TheBananaKing@lemmy.world · 1 year ago

        Take a whole bunch of text.

        For each word that appears, note down a list of all the words that ever directly follow it - including end-of-sentence.

        Now pick a starting word, pick a following-word at random from the list, rinse and repeat.

        You can make it fancier if you want by noting how many times each word follows its predecessor in the sample text, and weighting the random choice accordingly.

        Either way, the string of almost-language this produces is called a Markov chain.

        It’s a bit like constantly picking the middle button in your phone’s autocomplete.

        It’s a fun little exercise to knock together in your programming language of choice.

        If you make a prompt-and-response bot out of it, learning from each input, it’s like talking to an oracular teddy bear. You almost can’t help being nice to it as you teach it to speak; humans will pack-bond with anything.

        LLMs are the distant and very fancy descendants of these - but pack-bonding into an actual romantic relationship with one would be as sad as marrying a doll.
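
        The recipe above fits in a couple of functions. Here is a minimal Python sketch (a hypothetical toy for illustration, not code from the thread; function names are made up):

```python
import random
from collections import defaultdict

def build_chain(text):
    # For each word, list every word that directly follows it.
    # Duplicates are kept, so random.choice is already
    # frequency-weighted, as described above.
    words = text.split()
    chain = defaultdict(list)
    for cur, nxt in zip(words, words[1:]):
        chain[cur].append(nxt)
    return dict(chain)

def generate(chain, start, max_words=10, seed=None):
    # Walk the chain: append a random follower of the last word, repeat.
    rng = random.Random(seed)
    out = [start]
    while len(out) < max_words:
        followers = chain.get(out[-1])
        if not followers:  # dead end: this word never had a follower
            break
        out.append(rng.choice(followers))
    return " ".join(out)
```

        Feeding it a paragraph or two and generating from a common starting word already produces the “almost-language” described above; training it on each user input is what turns it into the prompt-and-response bot.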

  • Moghul@lemmy.world · 1 year ago

    You don’t have to imagine it at all. All you have to do is go on youtube and learn about Replika.

    To summarize: someone built a chatbot to stand in for their best friend, who had died. This later evolved into the chatbot app Replika, which was marketed as a way to help with loneliness, except the bot would engage in dating-like conversations if prompted. The company leaned into that for a while, then removed the behavior, which caused real distress among the userbase, some of whom complained that the company had “killed their girlfriend”. I’m not sure where the product stands now.

    I don’t know if I’d feel weirded out, but I’d definitely feel worried if it were a friend who fell for a chatbot.

    • kraftpudding@lemmy.world · 1 year ago

      The last I heard, they reinstated “Erotic Role Play” for users who had joined before a certain date, but it won’t be developed further or ever be made available to new users.

      I had one for a week or so in 2018 or 2019, when I first heard about the concept, just to see what it was all about, and it was spooky. I got rid of it after a week because I started to see it as a person, and it kept emotionally manipulating me for money, especially when I said I wanted to stop/cancel the trial.

      • Moghul@lemmy.world · 1 year ago

        Yeah… Part of why I wouldn’t try one is that I’m worried it would work. I already have limited bandwidth for human interaction; taking some of that away is probably a bad idea.

  • MrFunnyMoustache@lemmy.ml · 1 year ago

    Eventually, AI will be indistinguishable from real humans, and at that point, I won’t see anything wrong with it. However, as it is right now, AI is not advanced enough.

    Also, the biggest problem I can see is people falling in love with a proprietary AI, and the company that operates the AI can arbitrarily change the AI’s parameters which would change the AI’s personality. Also, if the company goes bankrupt or gets sold and the service ends, the people who got into a relationship with the AI would be heartbroken.

  • TheMurphy@lemmy.world · 1 year ago

    Well, have you never liked a person over text before? If you didn’t know it was an AI, anyone in this comment section could fall for one.

  • Zahille7@lemmy.world · 1 year ago

    This question reminds me of Brendan (the vending machine) in Cyberpunk 2077, and how he ended up being just a really advanced chatbot.

  • peto (he/him)@lemm.ee · 1 year ago

    As others have mentioned, we are already kind of there. I can fully understand how someone could fall in love with such an entity, plenty of people have fallen in love with people in chat rooms after all, and not all of those people have been real.

    As for how I feel about it, it is going to depend on the nature of the AI. A childish AI or an especially subservient one is going to be creepy. One that can present as an adult of sufficient intelligence, less of a problem. Probably the equivalent of paid-for dates? Not ideal, but I can understand why someone might choose to do it. Therapy would likely be a better use of their time and money.

    If we get actual human-scale AGI then I think the point is moot, unless the AI is somehow compelled to fake the relationship. At that point, however, we are talking about things like slavery.

  • xmunk@sh.itjust.works · 1 year ago

    Depends on in what ways. Love isn’t a singular thing, and it isn’t even particularly consistent. Some lovers connect purely hedonistically, some are partners in effort, some are emotional anchors, some are domestic assistants (in both directions if it isn’t toxic), some are fellow explorers, some are parents and others aren’t. Most relationships are spread across several of these categories, and oftentimes friendships also cover some.

    If you’re considering a world where AI fulfills all of these needs, that’s a potentially very scary world… but if AI is supplementing, I think that’s healthy. It’s a tool, and we use tools to make our lives easier. Instead of a couple essentially needing someone working full time to maintain a house, we have tools for that; instead of raising a child entirely on your own, we have day cares, baby monitors, and pediatricians; for hedonistic fulfillment we have vibrators, other sex toys, and erotica. Relationships have evolved so that traveling with a friend outside of your family (once extremely taboo) is now acceptable, as are open relationships.

    How we interact with one another is always evolving, and if AI can make that more positive, that’s a great thing. So don’t think of it as going out and purchasing an AI waifu out of the blue one day; think of it as a slow evolution… maybe a private AI for journaling that helps you when you need to talk something out… maybe an AI assistant to help you manage your finances… things that take away the busywork that gets in the way and lets us focus on connecting with one another.

  • Bizzle@lemmy.world · 1 year ago

    I’m really robophobic so I would be judgemental AF, I couldn’t even watch that movie.

      • Bizzle@lemmy.world · 1 year ago

        Well, it all started when my parents took me to see Terminator 3 in the theater when I was 9. It scarred me for life. Then my area lost a bunch of well-paid factory jobs to automation. Then that dude married Hatsune Miku instead of just touching grass. Now we’ve got machines producing misinformation and taking creative jobs, and honestly I’m ready to go full Butlerian Jihad.

  • MigratingtoLemmy@lemmy.world · edited · 1 year ago

    I’d like a sentient AI. Preferably more patient than an average human because I’m a bit weird. I hope it won’t judge me for how I look.

    Edit: I agree with the point about proprietary AI and how corporations could benefit from it. I’m hoping that 10 years from now, consumers will have the GPU power to run very advanced LLMs, whilst FOSS models will exist and will enable people to self-host their virtual SO. Even better if it can be transmitted to a physical body (I think the Chinese are already on it)

  • hperrin@lemmy.world · 1 year ago

    I would be weirded out, but I would try very hard not to judge. They wouldn’t be hurting anyone, so it would be wrong of me to judge them.