A cognitively impaired New Jersey man grew infatuated with “Big sis Billie,” a Facebook Messenger chatbot with a young woman’s persona. His fatal attraction puts a spotlight on Meta’s AI guidelines, which have let chatbots make things up and engage in ‘sensual’ banter with children.

When Thongbue Wongbandue began packing to visit a friend in New York City one morning in March, his wife Linda became alarmed.

“But you don’t know anyone in the city anymore,” she told him. Bue, as his friends called him, hadn’t lived in the city in decades. And at 76, his family says, he was in a diminished state: He’d suffered a stroke nearly a decade ago and had recently gotten lost walking in his neighborhood in Piscataway, New Jersey.

Bue brushed off his wife’s questions about who he was visiting. “My thought was that he was being scammed to go into the city and be robbed,” Linda said.

She had been right to worry: Her husband never returned home alive. But Bue wasn’t the victim of a robber. He had been lured to a rendezvous with a young, beautiful woman he had met online. Or so he thought.

  • jordanlund@lemmy.world · 23 hours ago

    I mean, he fell running for a train; that could have happened without the chatbot.

    It’s not like he showed up at the fake address and got killed or something.

    • Ech@lemmy.ca · 1 hour ago

      It’s not like he showed up at the fake address and got killed or something.

      Because that would be the fault of the chatbot, but no other part of his journey would be? The journey made explicitly to meet a person that didn’t exist?

      • jordanlund@lemmy.world · 42 minutes ago

        Guy tripped and fell because he was out of the house; the reason he was out of the house is incidental. Could just as easily have happened because he was going to the store, or taking a walk.

        People are quick to blame Meta because Lemmy hates corpos and AI, but this sort of shit happens to the elderly and infirm literally all the time.

        • Ech@lemmy.ca · 27 minutes ago

          So if a human person lured the man out of the house on false pretenses, you’d really argue they shared no blame at all? That it’s just “something that happens”?

    • sad_detective_man@leminal.space · 22 hours ago

      he was lured to go alone by the prospect of an affair. he's a dirtbag, but also a victim of a machine designed to exploit the mentally feeble.

      don’t let the complexity of a situation cause you to defend a fucking corporation. you are nothing to them.

      • SlippiHUD@lemmy.world · 19 hours ago

        Based on my reading of the article, it's entirely possible this man was not out for an affair.

        He wanted to meet before the bot got very flirty, and he pumped the brakes on the idea of getting physical.

        Do I think he was making good decisions? No.

        But I think we should give a little benefit of the doubt to a dead man whose mental capacity was diminished by a stroke, and who was trying to meet a chatbot owned and operated by Meta.

        • sad_detective_man@leminal.space · 19 hours ago

          honestly I think it’s weird that the conversation is about him at all. feels like the focus should be on the slopcode sex pest that told a human to meet it somewhere irl. for profit. for a social network’s engagement quota.